00:00:00.001 Started by upstream project "autotest-spdk-master-vs-dpdk-main" build number 3986 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3581 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.072 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.073 The recommended git tool is: git 00:00:00.073 using credential 00000000-0000-0000-0000-000000000002 00:00:00.074 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.106 Fetching changes from the remote Git repository 00:00:00.109 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.142 Using shallow fetch with depth 1 00:00:00.142 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.142 > git --version # timeout=10 00:00:00.182 > git --version # 'git version 2.39.2' 00:00:00.182 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.217 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.217 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:06.299 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:06.310 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:06.320 Checking out Revision 58e4f482292076ec19d68e6712473e60ef956aed (FETCH_HEAD) 00:00:06.320 > git config core.sparsecheckout # timeout=10 00:00:06.332 > git read-tree -mu HEAD # timeout=10 00:00:06.348 > git checkout -f 58e4f482292076ec19d68e6712473e60ef956aed # timeout=5 00:00:06.368 Commit message: "packer: Fix typo in a package name" 00:00:06.369 > git rev-list --no-walk 58e4f482292076ec19d68e6712473e60ef956aed # timeout=10 00:00:06.449 [Pipeline] Start of Pipeline 00:00:06.462 [Pipeline] library 00:00:06.464 Loading library shm_lib@master 00:00:06.464 Library shm_lib@master is cached. Copying from home. 00:00:06.480 [Pipeline] node 00:00:06.498 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:06.499 [Pipeline] { 00:00:06.507 [Pipeline] catchError 00:00:06.508 [Pipeline] { 00:00:06.517 [Pipeline] wrap 00:00:06.525 [Pipeline] { 00:00:06.531 [Pipeline] stage 00:00:06.532 [Pipeline] { (Prologue) 00:00:06.770 [Pipeline] sh 00:00:07.052 + logger -p user.info -t JENKINS-CI 00:00:07.069 [Pipeline] echo 00:00:07.070 Node: WFP20 00:00:07.080 [Pipeline] sh 00:00:07.381 [Pipeline] setCustomBuildProperty 00:00:07.391 [Pipeline] echo 00:00:07.393 Cleanup processes 00:00:07.396 [Pipeline] sh 00:00:07.678 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.678 3176226 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.691 [Pipeline] sh 00:00:07.978 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.978 ++ grep -v 'sudo pgrep' 00:00:07.978 ++ awk '{print $1}' 00:00:07.979 + sudo kill -9 00:00:07.979 + true 00:00:07.994 [Pipeline] cleanWs 00:00:08.008 [WS-CLEANUP] Deleting project workspace... 00:00:08.009 [WS-CLEANUP] Deferred wipeout is used... 
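A minimal standalone sketch of the stale-process cleanup traced above, assuming the workspace path shown in the log; the pids variable and the explicit `|| true` spelling are illustrative (the trace only shows the expanded commands):

  #!/usr/bin/env bash
  # Collect PIDs of anything still running out of the previous build's
  # spdk checkout: pgrep lists them, grep drops the pgrep call itself,
  # awk keeps only the PID column.
  WS=/var/jenkins/workspace/short-fuzz-phy-autotest
  pids=$(sudo pgrep -af "$WS/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
  # kill(1) exits non-zero when the PID list is empty, so tolerate failure,
  # matching the bare `+ true` that follows `+ sudo kill -9` in the trace.
  sudo kill -9 $pids || true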
00:00:08.050 [WS-CLEANUP] done 00:00:08.063 [Pipeline] setCustomBuildProperty 00:00:08.081 [Pipeline] sh 00:00:08.363 + sudo git config --global --replace-all safe.directory '*' 00:00:08.477 [Pipeline] httpRequest 00:00:09.198 [Pipeline] echo 00:00:09.200 Sorcerer 10.211.164.101 is alive 00:00:09.208 [Pipeline] retry 00:00:09.210 [Pipeline] { 00:00:09.224 [Pipeline] httpRequest 00:00:09.228 HttpMethod: GET 00:00:09.229 URL: http://10.211.164.101/packages/jbp_58e4f482292076ec19d68e6712473e60ef956aed.tar.gz 00:00:09.229 Sending request to url: http://10.211.164.101/packages/jbp_58e4f482292076ec19d68e6712473e60ef956aed.tar.gz 00:00:09.231 Response Code: HTTP/1.1 200 OK 00:00:09.231 Success: Status code 200 is in the accepted range: 200,404 00:00:09.232 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_58e4f482292076ec19d68e6712473e60ef956aed.tar.gz 00:00:10.392 [Pipeline] } 00:00:10.412 [Pipeline] // retry 00:00:10.420 [Pipeline] sh 00:00:10.706 + tar --no-same-owner -xf jbp_58e4f482292076ec19d68e6712473e60ef956aed.tar.gz 00:00:10.723 [Pipeline] httpRequest 00:00:11.132 [Pipeline] echo 00:00:11.134 Sorcerer 10.211.164.101 is alive 00:00:11.147 [Pipeline] retry 00:00:11.150 [Pipeline] { 00:00:11.166 [Pipeline] httpRequest 00:00:11.171 HttpMethod: GET 00:00:11.171 URL: http://10.211.164.101/packages/spdk_169c3cd047cec29b3b1e206c9259a77f3e6a8077.tar.gz 00:00:11.172 Sending request to url: http://10.211.164.101/packages/spdk_169c3cd047cec29b3b1e206c9259a77f3e6a8077.tar.gz 00:00:11.194 Response Code: HTTP/1.1 200 OK 00:00:11.194 Success: Status code 200 is in the accepted range: 200,404 00:00:11.194 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_169c3cd047cec29b3b1e206c9259a77f3e6a8077.tar.gz 00:01:49.537 [Pipeline] } 00:01:49.555 [Pipeline] // retry 00:01:49.562 [Pipeline] sh 00:01:49.849 + tar --no-same-owner -xf spdk_169c3cd047cec29b3b1e206c9259a77f3e6a8077.tar.gz 00:01:52.405 [Pipeline] sh 00:01:52.692 + git -C spdk log --oneline -n5 00:01:52.692 169c3cd04 thread: set SPDK_CONFIG_MAX_NUMA_NODES to 1 if not defined 00:01:52.692 cab1decc1 thread: add NUMA node support to spdk_iobuf_put() 00:01:52.692 40c9acf6d env: add spdk_mem_get_numa_id 00:01:52.692 0f99ab2fa thread: allocate iobuf memory based on numa_id 00:01:52.692 2ef611c19 thread: update all iobuf non-get/put functions for multiple NUMA nodes 00:01:52.710 [Pipeline] withCredentials 00:01:52.721 > git --version # timeout=10 00:01:52.736 > git --version # 'git version 2.39.2' 00:01:52.753 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:52.755 [Pipeline] { 00:01:52.766 [Pipeline] retry 00:01:52.768 [Pipeline] { 00:01:52.786 [Pipeline] sh 00:01:53.072 + git ls-remote http://dpdk.org/git/dpdk main 00:01:53.099 [Pipeline] } 00:01:53.119 [Pipeline] // retry 00:01:53.124 [Pipeline] } 00:01:53.141 [Pipeline] // withCredentials 00:01:53.152 [Pipeline] httpRequest 00:01:53.535 [Pipeline] echo 00:01:53.537 Sorcerer 10.211.164.101 is alive 00:01:53.548 [Pipeline] retry 00:01:53.550 [Pipeline] { 00:01:53.564 [Pipeline] httpRequest 00:01:53.569 HttpMethod: GET 00:01:53.570 URL: http://10.211.164.101/packages/dpdk_6dad0bb5c8621644beca86ff5f4910a943ba604d.tar.gz 00:01:53.570 Sending request to url: http://10.211.164.101/packages/dpdk_6dad0bb5c8621644beca86ff5f4910a943ba604d.tar.gz 00:01:53.571 Response Code: HTTP/1.1 200 OK 00:01:53.572 Success: Status code 200 is in the accepted range: 200,404 00:01:53.572 Saving response body to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_6dad0bb5c8621644beca86ff5f4910a943ba604d.tar.gz 00:01:59.374 [Pipeline] } 00:01:59.390 [Pipeline] // retry 00:01:59.397 [Pipeline] sh 00:01:59.683 + tar --no-same-owner -xf dpdk_6dad0bb5c8621644beca86ff5f4910a943ba604d.tar.gz 00:02:01.074 [Pipeline] sh 00:02:01.357 + git -C dpdk log --oneline -n5 00:02:01.357 6dad0bb5c8 event/cnxk: fix getwork write data on reconfig 00:02:01.357 b74f298f9b test/event: fix device stop 00:02:01.357 34e3ad3a1e eventdev: remove single event enqueue and dequeue 00:02:01.357 5079ede71e event/skeleton: remove single event enqueue and dequeue 00:02:01.357 a83fc0f4e1 event/cnxk: remove single event enqueue and dequeue 00:02:01.367 [Pipeline] } 00:02:01.380 [Pipeline] // stage 00:02:01.390 [Pipeline] stage 00:02:01.392 [Pipeline] { (Prepare) 00:02:01.408 [Pipeline] writeFile 00:02:01.422 [Pipeline] sh 00:02:01.704 + logger -p user.info -t JENKINS-CI 00:02:01.716 [Pipeline] sh 00:02:02.001 + logger -p user.info -t JENKINS-CI 00:02:02.013 [Pipeline] sh 00:02:02.295 + cat autorun-spdk.conf 00:02:02.295 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:02.295 SPDK_RUN_UBSAN=1 00:02:02.295 SPDK_TEST_FUZZER=1 00:02:02.295 SPDK_TEST_FUZZER_SHORT=1 00:02:02.295 SPDK_TEST_SETUP=1 00:02:02.295 SPDK_TEST_NATIVE_DPDK=main 00:02:02.295 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:02.301 RUN_NIGHTLY=1 00:02:02.305 [Pipeline] readFile 00:02:02.326 [Pipeline] withEnv 00:02:02.328 [Pipeline] { 00:02:02.338 [Pipeline] sh 00:02:02.622 + set -ex 00:02:02.622 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:02:02.622 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:02.622 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:02.622 ++ SPDK_RUN_UBSAN=1 00:02:02.622 ++ SPDK_TEST_FUZZER=1 00:02:02.622 ++ SPDK_TEST_FUZZER_SHORT=1 00:02:02.622 ++ SPDK_TEST_SETUP=1 00:02:02.622 ++ SPDK_TEST_NATIVE_DPDK=main 00:02:02.622 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:02.622 ++ RUN_NIGHTLY=1 00:02:02.623 + case $SPDK_TEST_NVMF_NICS in 00:02:02.623 + DRIVERS= 00:02:02.623 + [[ -n '' ]] 00:02:02.623 + exit 0 00:02:02.631 [Pipeline] } 00:02:02.645 [Pipeline] // withEnv 00:02:02.649 [Pipeline] } 00:02:02.662 [Pipeline] // stage 00:02:02.670 [Pipeline] catchError 00:02:02.671 [Pipeline] { 00:02:02.684 [Pipeline] timeout 00:02:02.684 Timeout set to expire in 30 min 00:02:02.685 [Pipeline] { 00:02:02.698 [Pipeline] stage 00:02:02.699 [Pipeline] { (Tests) 00:02:02.712 [Pipeline] sh 00:02:02.995 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:02.995 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:02.995 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:02:02.995 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:02:02.995 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:02.995 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:02:02.995 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:02:02.995 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:02:02.995 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:02:02.995 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:02:02.995 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:02:02.995 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:02.995 + source /etc/os-release 00:02:02.995 ++ NAME='Fedora Linux' 00:02:02.995 ++ VERSION='39 (Cloud Edition)' 00:02:02.995 ++ ID=fedora 00:02:02.995 ++ VERSION_ID=39 00:02:02.995 ++ VERSION_CODENAME= 00:02:02.995 ++ PLATFORM_ID=platform:f39 00:02:02.995 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:02:02.995 ++ ANSI_COLOR='0;38;2;60;110;180' 00:02:02.995 ++ LOGO=fedora-logo-icon 00:02:02.995 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:02:02.995 ++ HOME_URL=https://fedoraproject.org/ 00:02:02.995 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:02:02.995 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:02:02.995 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:02:02.995 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:02:02.995 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:02:02.995 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:02:02.995 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:02:02.995 ++ SUPPORT_END=2024-11-12 00:02:02.995 ++ VARIANT='Cloud Edition' 00:02:02.995 ++ VARIANT_ID=cloud 00:02:02.995 + uname -a 00:02:02.995 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:02:02.995 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:06.286 Hugepages 00:02:06.286 node hugesize free / total 00:02:06.286 node0 1048576kB 0 / 0 00:02:06.286 node0 2048kB 0 / 0 00:02:06.286 node1 1048576kB 0 / 0 00:02:06.286 node1 2048kB 0 / 0 00:02:06.286 00:02:06.286 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:06.286 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:02:06.286 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:02:06.286 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:02:06.286 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:02:06.286 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:02:06.286 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:02:06.286 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:02:06.286 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:02:06.286 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:02:06.286 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:02:06.286 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:02:06.286 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:02:06.286 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:02:06.286 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:02:06.286 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:02:06.286 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:02:06.286 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:02:06.286 + rm -f /tmp/spdk-ld-path 00:02:06.286 + source autorun-spdk.conf 00:02:06.286 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:06.286 ++ SPDK_RUN_UBSAN=1 00:02:06.286 ++ SPDK_TEST_FUZZER=1 00:02:06.286 ++ SPDK_TEST_FUZZER_SHORT=1 00:02:06.286 ++ SPDK_TEST_SETUP=1 00:02:06.286 ++ SPDK_TEST_NATIVE_DPDK=main 00:02:06.286 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:06.286 ++ RUN_NIGHTLY=1 00:02:06.286 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:02:06.286 + [[ -n '' ]] 00:02:06.286 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:06.286 + for M in 
/var/spdk/build-*-manifest.txt 00:02:06.286 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:02:06.286 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:06.286 + for M in /var/spdk/build-*-manifest.txt 00:02:06.286 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:02:06.286 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:06.286 + for M in /var/spdk/build-*-manifest.txt 00:02:06.286 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:02:06.286 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:06.286 ++ uname 00:02:06.286 + [[ Linux == \L\i\n\u\x ]] 00:02:06.286 + sudo dmesg -T 00:02:06.286 + sudo dmesg --clear 00:02:06.286 + dmesg_pid=3177714 00:02:06.286 + [[ Fedora Linux == FreeBSD ]] 00:02:06.286 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:06.286 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:06.286 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:02:06.286 + [[ -x /usr/src/fio-static/fio ]] 00:02:06.286 + export FIO_BIN=/usr/src/fio-static/fio 00:02:06.286 + FIO_BIN=/usr/src/fio-static/fio 00:02:06.286 + sudo dmesg -Tw 00:02:06.286 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:02:06.286 + [[ ! -v VFIO_QEMU_BIN ]] 00:02:06.286 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:02:06.286 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:06.286 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:06.286 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:02:06.286 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:06.286 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:06.286 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:06.286 Test configuration: 00:02:06.286 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:06.286 SPDK_RUN_UBSAN=1 00:02:06.286 SPDK_TEST_FUZZER=1 00:02:06.286 SPDK_TEST_FUZZER_SHORT=1 00:02:06.286 SPDK_TEST_SETUP=1 00:02:06.286 SPDK_TEST_NATIVE_DPDK=main 00:02:06.286 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:06.286 RUN_NIGHTLY=1 21:27:07 -- common/autotest_common.sh@1688 -- $ [[ n == y ]] 00:02:06.286 21:27:07 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:06.286 21:27:07 -- scripts/common.sh@15 -- $ shopt -s extglob 00:02:06.286 21:27:07 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:02:06.286 21:27:07 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:06.286 21:27:07 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:06.286 21:27:07 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:06.286 21:27:07 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:06.286 21:27:07 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:06.286 21:27:07 -- paths/export.sh@5 -- $ export PATH 00:02:06.286 21:27:07 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:06.286 21:27:07 -- common/autobuild_common.sh@485 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:06.286 21:27:07 -- common/autobuild_common.sh@486 -- $ date +%s 00:02:06.286 21:27:07 -- common/autobuild_common.sh@486 -- $ mktemp -dt spdk_1730060827.XXXXXX 00:02:06.286 21:27:07 -- common/autobuild_common.sh@486 -- $ SPDK_WORKSPACE=/tmp/spdk_1730060827.RAb0k7 00:02:06.286 21:27:07 -- common/autobuild_common.sh@488 -- $ [[ -n '' ]] 00:02:06.286 21:27:07 -- common/autobuild_common.sh@492 -- $ '[' -n main ']' 00:02:06.286 21:27:07 -- common/autobuild_common.sh@493 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:06.286 21:27:07 -- common/autobuild_common.sh@493 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:02:06.286 21:27:07 -- common/autobuild_common.sh@499 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:02:06.286 21:27:07 -- common/autobuild_common.sh@501 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:02:06.286 21:27:07 -- common/autobuild_common.sh@502 -- $ get_config_params 00:02:06.286 21:27:07 -- common/autotest_common.sh@407 -- $ xtrace_disable 00:02:06.286 21:27:07 -- common/autotest_common.sh@10 -- $ set +x 00:02:06.286 21:27:07 -- common/autobuild_common.sh@502 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:02:06.286 21:27:07 -- common/autobuild_common.sh@504 -- $ start_monitor_resources 00:02:06.286 21:27:07 -- pm/common@17 -- $ local monitor 00:02:06.286 21:27:07 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:06.286 21:27:07 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:06.286 21:27:07 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 
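The scratch workspace created above (/tmp/spdk_1730060827.RAb0k7) follows a timestamped mktemp template; a small sketch of that pattern, with the variable name taken from the trace and the echo added for illustration:

  # Per-run scratch directory keyed to the build's epoch timestamp,
  # as in `mktemp -dt spdk_1730060827.XXXXXX` above.
  ts=$(date +%s)
  SPDK_WORKSPACE=$(mktemp -dt "spdk_${ts}.XXXXXX")
  echo "$SPDK_WORKSPACE"   # e.g. /tmp/spdk_1730060827.RAb0k7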
00:02:06.286 21:27:07 -- pm/common@21 -- $ date +%s 00:02:06.286 21:27:07 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:06.286 21:27:07 -- pm/common@21 -- $ date +%s 00:02:06.286 21:27:07 -- pm/common@25 -- $ sleep 1 00:02:06.286 21:27:07 -- pm/common@21 -- $ date +%s 00:02:06.286 21:27:07 -- pm/common@21 -- $ date +%s 00:02:06.286 21:27:07 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1730060827 00:02:06.286 21:27:07 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1730060827 00:02:06.286 21:27:07 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1730060827 00:02:06.286 21:27:07 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1730060827 00:02:06.286 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1730060827_collect-vmstat.pm.log 00:02:06.287 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1730060827_collect-cpu-load.pm.log 00:02:06.287 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1730060827_collect-cpu-temp.pm.log 00:02:06.287 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1730060827_collect-bmc-pm.bmc.pm.log 00:02:07.226 21:27:08 -- common/autobuild_common.sh@505 -- $ trap stop_monitor_resources EXIT 00:02:07.226 21:27:08 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:02:07.226 21:27:08 -- spdk/autobuild.sh@12 -- $ umask 022 00:02:07.226 21:27:08 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:07.226 21:27:08 -- spdk/autobuild.sh@16 -- $ date -u 00:02:07.226 Sun Oct 27 08:27:08 PM UTC 2024 00:02:07.226 21:27:08 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:02:07.226 v25.01-pre-118-g169c3cd04 00:02:07.226 21:27:08 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:02:07.226 21:27:08 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:02:07.226 21:27:08 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:02:07.226 21:27:08 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:02:07.226 21:27:08 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:07.226 21:27:08 -- common/autotest_common.sh@10 -- $ set +x 00:02:07.226 ************************************ 00:02:07.226 START TEST ubsan 00:02:07.226 ************************************ 00:02:07.226 21:27:08 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan' 00:02:07.226 using ubsan 00:02:07.226 00:02:07.226 real 0m0.001s 00:02:07.226 user 0m0.000s 00:02:07.226 sys 0m0.000s 00:02:07.226 21:27:08 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:07.226 21:27:08 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:02:07.226 ************************************ 00:02:07.226 END TEST ubsan 00:02:07.226 ************************************ 00:02:07.226 21:27:08 -- spdk/autobuild.sh@27 -- $ 
'[' -n main ']' 00:02:07.226 21:27:08 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:02:07.226 21:27:08 -- common/autobuild_common.sh@442 -- $ run_test build_native_dpdk _build_native_dpdk 00:02:07.226 21:27:08 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:02:07.226 21:27:08 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:07.226 21:27:08 -- common/autotest_common.sh@10 -- $ set +x 00:02:07.487 ************************************ 00:02:07.487 START TEST build_native_dpdk 00:02:07.487 ************************************ 00:02:07.487 21:27:08 build_native_dpdk -- common/autotest_common.sh@1125 -- $ _build_native_dpdk 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:02:07.487 6dad0bb5c8 event/cnxk: fix getwork write data on reconfig 00:02:07.487 b74f298f9b test/event: fix device stop 00:02:07.487 34e3ad3a1e eventdev: remove single event enqueue and dequeue 00:02:07.487 5079ede71e event/skeleton: remove single event enqueue and dequeue 00:02:07.487 a83fc0f4e1 event/cnxk: remove single event enqueue and dequeue 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=24.11.0-rc1 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:02:07.487 21:27:08 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 24.11.0-rc1 21.11.0 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 24.11.0-rc1 '<' 21.11.0 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=4 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:07.487 21:27:08 
build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:07.487 21:27:08 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:02:07.487 21:27:09 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1 00:02:07.487 patching file config/rte_config.h 00:02:07.487 Hunk #1 succeeded at 71 (offset 12 lines). 00:02:07.487 21:27:09 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 24.11.0-rc1 24.07.0 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 24.11.0-rc1 '<' 24.07.0 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=4 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@364 -- $ (( v++ )) 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 11 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@353 -- $ local d=11 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 11 =~ ^[0-9]+$ ]] 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@355 -- $ echo 11 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=11 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 07 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@353 -- $ local d=07 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 07 =~ ^[0-9]+$ ]] 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@355 -- $ echo 7 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=7 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:02:07.487 21:27:09 build_native_dpdk -- common/autobuild_common.sh@179 -- $ ge 24.11.0-rc1 24.07.0 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 24.11.0-rc1 '>=' 24.07.0 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>=' 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=4 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:07.487 21:27:09 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@348 -- $ : 1 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:07.488 21:27:09 build_native_dpdk -- 
scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 24 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=24 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@364 -- $ (( v++ )) 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 11 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@353 -- $ local d=11 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 11 =~ ^[0-9]+$ ]] 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@355 -- $ echo 11 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=11 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 07 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@353 -- $ local d=07 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 07 =~ ^[0-9]+$ ]] 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@355 -- $ echo 7 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=7 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:07.488 21:27:09 build_native_dpdk -- scripts/common.sh@367 -- $ return 0 00:02:07.488 21:27:09 build_native_dpdk -- common/autobuild_common.sh@180 -- $ patch -p1 00:02:07.488 patching file drivers/bus/pci/linux/pci_uio.c 00:02:07.488 21:27:09 build_native_dpdk -- common/autobuild_common.sh@183 -- $ dpdk_kmods=false 00:02:07.488 21:27:09 build_native_dpdk -- common/autobuild_common.sh@184 -- $ uname -s 00:02:07.488 21:27:09 build_native_dpdk -- common/autobuild_common.sh@184 -- $ '[' Linux = FreeBSD ']' 00:02:07.488 21:27:09 build_native_dpdk -- common/autobuild_common.sh@188 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:02:07.488 21:27:09 build_native_dpdk -- common/autobuild_common.sh@188 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:02:12.768 The Meson build system 00:02:12.768 Version: 1.5.0 00:02:12.768 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:12.768 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:02:12.768 Build type: native build 00:02:12.768 Project name: DPDK 
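The version gate traced above (scripts/common.sh's lt/ge wrappers around cmp_versions, which decide whether the rte_config.h and pci_uio.c patches apply) reduces to a field-wise numeric compare. A compact sketch assuming purely numeric fields; the real decimal() helper seen in the trace also maps suffixes such as rc1 to numbers, which this simplification omits:

  # Split two versions on . - : and compare field by field, numerically.
  # Returns 0 when $1 < $2, 1 otherwise (mirroring the lt wrapper above).
  lt() {
      local -a a b
      IFS='.-:' read -ra a <<< "$1"
      IFS='.-:' read -ra b <<< "$2"
      local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
      for (( i = 0; i < n; i++ )); do
          # 10# forces base-10 so fields like 07 are not read as octal
          if (( 10#${a[i]:-0} > 10#${b[i]:-0} )); then return 1; fi
          if (( 10#${a[i]:-0} < 10#${b[i]:-0} )); then return 0; fi
      done
      return 1  # equal, so not less-than
  }
  lt 24.07.0 24.11.0 && echo older   # prints "older"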
00:02:12.768 Project version: 24.11.0-rc1 00:02:12.768 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:02:12.768 C linker for the host machine: gcc ld.bfd 2.40-14 00:02:12.768 Host machine cpu family: x86_64 00:02:12.768 Host machine cpu: x86_64 00:02:12.768 Message: ## Building in Developer Mode ## 00:02:12.768 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:12.768 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:02:12.768 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:02:12.768 Program python3 (elftools) found: YES (/usr/bin/python3) modules: elftools 00:02:12.768 Program cat found: YES (/usr/bin/cat) 00:02:12.768 config/meson.build:119: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 00:02:12.768 Compiler for C supports arguments -march=native: YES 00:02:12.768 Checking for size of "void *" : 8 00:02:12.768 Checking for size of "void *" : 8 (cached) 00:02:12.768 Compiler for C supports link arguments -Wl,--undefined-version: YES 00:02:12.768 Library m found: YES 00:02:12.768 Library numa found: YES 00:02:12.768 Has header "numaif.h" : YES 00:02:12.768 Library fdt found: NO 00:02:12.768 Library execinfo found: NO 00:02:12.768 Has header "execinfo.h" : YES 00:02:12.768 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:12.768 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:12.768 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:12.768 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:12.768 Run-time dependency openssl found: YES 3.1.1 00:02:12.768 Run-time dependency libpcap found: YES 1.10.4 00:02:12.768 Has header "pcap.h" with dependency libpcap: YES 00:02:12.768 Compiler for C supports arguments -Wcast-qual: YES 00:02:12.768 Compiler for C supports arguments -Wdeprecated: YES 00:02:12.768 Compiler for C supports arguments -Wformat: YES 00:02:12.768 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:12.768 Compiler for C supports arguments -Wformat-security: NO 00:02:12.768 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:12.768 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:12.768 Compiler for C supports arguments -Wnested-externs: YES 00:02:12.768 Compiler for C supports arguments -Wold-style-definition: YES 00:02:12.768 Compiler for C supports arguments -Wpointer-arith: YES 00:02:12.769 Compiler for C supports arguments -Wsign-compare: YES 00:02:12.769 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:12.769 Compiler for C supports arguments -Wundef: YES 00:02:12.769 Compiler for C supports arguments -Wwrite-strings: YES 00:02:12.769 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:12.769 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:12.769 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:12.769 Program objdump found: YES (/usr/bin/objdump) 00:02:12.769 Compiler for C supports arguments -mavx512f -mavx512vl -mavx512dq -mavx512bw: YES 00:02:12.769 Checking if "AVX512 checking" compiles: YES 00:02:12.769 Fetching value of define "__AVX512F__" : 1 00:02:12.769 Fetching value of define "__AVX512BW__" : 1 00:02:12.769 Fetching value of define "__AVX512DQ__" : 1 00:02:12.769 Fetching value of define "__AVX512VL__" : 1 00:02:12.769 Fetching 
value of define "__SSE4_2__" : 1 00:02:12.769 Fetching value of define "__AES__" : 1 00:02:12.769 Fetching value of define "__AVX__" : 1 00:02:12.769 Fetching value of define "__AVX2__" : 1 00:02:12.769 Fetching value of define "__AVX512BW__" : 1 00:02:12.769 Fetching value of define "__AVX512CD__" : 1 00:02:12.769 Fetching value of define "__AVX512DQ__" : 1 00:02:12.769 Fetching value of define "__AVX512F__" : 1 00:02:12.769 Fetching value of define "__AVX512VL__" : 1 00:02:12.769 Fetching value of define "__PCLMUL__" : 1 00:02:12.769 Fetching value of define "__RDRND__" : 1 00:02:12.769 Fetching value of define "__RDSEED__" : 1 00:02:12.769 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:12.769 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:12.769 Message: lib/log: Defining dependency "log" 00:02:12.769 Message: lib/kvargs: Defining dependency "kvargs" 00:02:12.769 Message: lib/argparse: Defining dependency "argparse" 00:02:12.769 Message: lib/telemetry: Defining dependency "telemetry" 00:02:12.769 Checking for function "getentropy" : NO 00:02:12.769 Message: lib/eal: Defining dependency "eal" 00:02:12.769 Message: lib/ptr_compress: Defining dependency "ptr_compress" 00:02:12.769 Message: lib/ring: Defining dependency "ring" 00:02:12.769 Message: lib/rcu: Defining dependency "rcu" 00:02:12.769 Message: lib/mempool: Defining dependency "mempool" 00:02:12.769 Message: lib/mbuf: Defining dependency "mbuf" 00:02:12.769 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:12.769 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:12.769 Compiler for C supports arguments -mpclmul: YES 00:02:12.769 Compiler for C supports arguments -maes: YES 00:02:12.769 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:12.769 Message: lib/net: Defining dependency "net" 00:02:12.769 Message: lib/meter: Defining dependency "meter" 00:02:12.769 Message: lib/ethdev: Defining dependency "ethdev" 00:02:12.769 Message: lib/pci: Defining dependency "pci" 00:02:12.769 Message: lib/cmdline: Defining dependency "cmdline" 00:02:12.769 Message: lib/metrics: Defining dependency "metrics" 00:02:12.769 Message: lib/hash: Defining dependency "hash" 00:02:12.769 Message: lib/timer: Defining dependency "timer" 00:02:12.769 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:12.769 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:12.769 Fetching value of define "__AVX512CD__" : 1 (cached) 00:02:12.769 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:12.769 Message: lib/acl: Defining dependency "acl" 00:02:12.769 Message: lib/bbdev: Defining dependency "bbdev" 00:02:12.769 Message: lib/bitratestats: Defining dependency "bitratestats" 00:02:12.769 Run-time dependency libelf found: YES 0.191 00:02:12.769 Message: lib/bpf: Defining dependency "bpf" 00:02:12.769 Message: lib/cfgfile: Defining dependency "cfgfile" 00:02:12.769 Message: lib/compressdev: Defining dependency "compressdev" 00:02:12.769 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:12.769 Message: lib/distributor: Defining dependency "distributor" 00:02:12.769 Message: lib/dmadev: Defining dependency "dmadev" 00:02:12.769 Message: lib/efd: Defining dependency "efd" 00:02:12.769 Message: lib/eventdev: Defining dependency "eventdev" 00:02:12.769 Message: lib/dispatcher: Defining dependency "dispatcher" 00:02:12.769 Message: lib/gpudev: Defining dependency "gpudev" 00:02:12.769 Message: lib/gro: Defining dependency "gro" 00:02:12.769 Message: lib/gso: Defining 
dependency "gso" 00:02:12.769 Message: lib/ip_frag: Defining dependency "ip_frag" 00:02:12.769 Message: lib/jobstats: Defining dependency "jobstats" 00:02:12.769 Message: lib/latencystats: Defining dependency "latencystats" 00:02:12.769 Message: lib/lpm: Defining dependency "lpm" 00:02:12.769 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:12.769 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:12.769 Fetching value of define "__AVX512IFMA__" : (undefined) 00:02:12.769 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:02:12.769 Message: lib/member: Defining dependency "member" 00:02:12.769 Message: lib/pcapng: Defining dependency "pcapng" 00:02:12.769 Message: lib/power: Defining dependency "power" 00:02:12.769 Message: lib/rawdev: Defining dependency "rawdev" 00:02:12.769 Message: lib/regexdev: Defining dependency "regexdev" 00:02:12.769 Message: lib/mldev: Defining dependency "mldev" 00:02:12.769 Message: lib/rib: Defining dependency "rib" 00:02:12.769 Message: lib/reorder: Defining dependency "reorder" 00:02:12.769 Message: lib/sched: Defining dependency "sched" 00:02:12.769 Message: lib/security: Defining dependency "security" 00:02:12.769 Message: lib/stack: Defining dependency "stack" 00:02:12.769 Has header "linux/userfaultfd.h" : YES 00:02:12.769 Has header "linux/vduse.h" : YES 00:02:12.769 Message: lib/vhost: Defining dependency "vhost" 00:02:12.769 Message: lib/ipsec: Defining dependency "ipsec" 00:02:12.769 Message: lib/pdcp: Defining dependency "pdcp" 00:02:12.769 Message: lib/fib: Defining dependency "fib" 00:02:12.769 Message: lib/port: Defining dependency "port" 00:02:12.769 Message: lib/pdump: Defining dependency "pdump" 00:02:12.769 Message: lib/table: Defining dependency "table" 00:02:12.769 Message: lib/pipeline: Defining dependency "pipeline" 00:02:12.769 Message: lib/graph: Defining dependency "graph" 00:02:12.769 Message: lib/node: Defining dependency "node" 00:02:12.769 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:12.769 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:12.769 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:12.769 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:12.769 Compiler for C supports arguments -Wno-sign-compare: YES 00:02:12.769 Compiler for C supports arguments -Wno-unused-value: YES 00:02:12.769 Compiler for C supports arguments -Wno-format: YES 00:02:12.769 Compiler for C supports arguments -Wno-format-security: YES 00:02:12.769 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:02:12.769 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:02:12.769 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:02:13.337 Compiler for C supports arguments -Wno-unused-parameter: YES 00:02:13.337 Compiler for C supports arguments -march=skylake-avx512: YES 00:02:13.337 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:02:13.337 Has header "sys/epoll.h" : YES 00:02:13.337 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:13.337 Configuring doxy-api-html.conf using configuration 00:02:13.337 doc/api/meson.build:54: WARNING: The variable(s) 'DTS_API_MAIN_PAGE' in the input file 'doc/api/doxy-api.conf.in' are not present in the given configuration data. 
00:02:13.337 Configuring doxy-api-man.conf using configuration 00:02:13.337 doc/api/meson.build:67: WARNING: The variable(s) 'DTS_API_MAIN_PAGE' in the input file 'doc/api/doxy-api.conf.in' are not present in the given configuration data. 00:02:13.337 Program mandb found: YES (/usr/bin/mandb) 00:02:13.337 Program sphinx-build found: NO 00:02:13.337 Program sphinx-build found: NO 00:02:13.337 Configuring rte_build_config.h using configuration 00:02:13.337 Message: 00:02:13.337 ================= 00:02:13.337 Applications Enabled 00:02:13.337 ================= 00:02:13.337 00:02:13.337 apps: 00:02:13.337 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, 00:02:13.337 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline, 00:02:13.337 test-pmd, test-regex, test-sad, test-security-perf, 00:02:13.337 00:02:13.337 Message: 00:02:13.337 ================= 00:02:13.337 Libraries Enabled 00:02:13.337 ================= 00:02:13.337 00:02:13.337 libs: 00:02:13.337 log, kvargs, argparse, telemetry, eal, ptr_compress, ring, rcu, 00:02:13.337 mempool, mbuf, net, meter, ethdev, pci, cmdline, metrics, 00:02:13.337 hash, timer, acl, bbdev, bitratestats, bpf, cfgfile, compressdev, 00:02:13.337 cryptodev, distributor, dmadev, efd, eventdev, dispatcher, gpudev, gro, 00:02:13.337 gso, ip_frag, jobstats, latencystats, lpm, member, pcapng, power, 00:02:13.337 rawdev, regexdev, mldev, rib, reorder, sched, security, stack, 00:02:13.337 vhost, ipsec, pdcp, fib, port, pdump, table, pipeline, 00:02:13.337 graph, node, 00:02:13.337 00:02:13.337 Message: 00:02:13.337 =============== 00:02:13.337 Drivers Enabled 00:02:13.337 =============== 00:02:13.337 00:02:13.337 common: 00:02:13.337 00:02:13.337 bus: 00:02:13.337 pci, vdev, 00:02:13.337 mempool: 00:02:13.337 ring, 00:02:13.337 dma: 00:02:13.337 00:02:13.337 net: 00:02:13.337 i40e, 00:02:13.337 raw: 00:02:13.337 00:02:13.337 crypto: 00:02:13.337 00:02:13.337 compress: 00:02:13.337 00:02:13.337 regex: 00:02:13.337 00:02:13.337 ml: 00:02:13.337 00:02:13.337 vdpa: 00:02:13.337 00:02:13.337 event: 00:02:13.337 00:02:13.337 baseband: 00:02:13.337 00:02:13.337 gpu: 00:02:13.337 00:02:13.337 00:02:13.337 Message: 00:02:13.337 ================= 00:02:13.337 Content Skipped 00:02:13.337 ================= 00:02:13.337 00:02:13.337 apps: 00:02:13.337 00:02:13.337 libs: 00:02:13.337 00:02:13.337 drivers: 00:02:13.337 common/cpt: not in enabled drivers build config 00:02:13.337 common/dpaax: not in enabled drivers build config 00:02:13.337 common/iavf: not in enabled drivers build config 00:02:13.337 common/idpf: not in enabled drivers build config 00:02:13.337 common/ionic: not in enabled drivers build config 00:02:13.337 common/mvep: not in enabled drivers build config 00:02:13.337 common/octeontx: not in enabled drivers build config 00:02:13.337 bus/auxiliary: not in enabled drivers build config 00:02:13.337 bus/cdx: not in enabled drivers build config 00:02:13.337 bus/dpaa: not in enabled drivers build config 00:02:13.337 bus/fslmc: not in enabled drivers build config 00:02:13.337 bus/ifpga: not in enabled drivers build config 00:02:13.337 bus/platform: not in enabled drivers build config 00:02:13.337 bus/uacce: not in enabled drivers build config 00:02:13.337 bus/vmbus: not in enabled drivers build config 00:02:13.337 common/cnxk: not in enabled drivers build config 00:02:13.337 common/mlx5: not in enabled drivers build config 00:02:13.337 common/nfp: not in enabled drivers build 
config 00:02:13.337 common/nitrox: not in enabled drivers build config 00:02:13.337 common/qat: not in enabled drivers build config 00:02:13.337 common/sfc_efx: not in enabled drivers build config 00:02:13.337 mempool/bucket: not in enabled drivers build config 00:02:13.337 mempool/cnxk: not in enabled drivers build config 00:02:13.337 mempool/dpaa: not in enabled drivers build config 00:02:13.337 mempool/dpaa2: not in enabled drivers build config 00:02:13.337 mempool/octeontx: not in enabled drivers build config 00:02:13.337 mempool/stack: not in enabled drivers build config 00:02:13.337 dma/cnxk: not in enabled drivers build config 00:02:13.337 dma/dpaa: not in enabled drivers build config 00:02:13.337 dma/dpaa2: not in enabled drivers build config 00:02:13.337 dma/hisilicon: not in enabled drivers build config 00:02:13.337 dma/idxd: not in enabled drivers build config 00:02:13.337 dma/ioat: not in enabled drivers build config 00:02:13.337 dma/odm: not in enabled drivers build config 00:02:13.337 dma/skeleton: not in enabled drivers build config 00:02:13.337 net/af_packet: not in enabled drivers build config 00:02:13.337 net/af_xdp: not in enabled drivers build config 00:02:13.337 net/ark: not in enabled drivers build config 00:02:13.337 net/atlantic: not in enabled drivers build config 00:02:13.337 net/avp: not in enabled drivers build config 00:02:13.337 net/axgbe: not in enabled drivers build config 00:02:13.337 net/bnx2x: not in enabled drivers build config 00:02:13.337 net/bnxt: not in enabled drivers build config 00:02:13.337 net/bonding: not in enabled drivers build config 00:02:13.337 net/cnxk: not in enabled drivers build config 00:02:13.337 net/cpfl: not in enabled drivers build config 00:02:13.337 net/cxgbe: not in enabled drivers build config 00:02:13.337 net/dpaa: not in enabled drivers build config 00:02:13.337 net/dpaa2: not in enabled drivers build config 00:02:13.337 net/e1000: not in enabled drivers build config 00:02:13.337 net/ena: not in enabled drivers build config 00:02:13.337 net/enetc: not in enabled drivers build config 00:02:13.337 net/enetfec: not in enabled drivers build config 00:02:13.337 net/enic: not in enabled drivers build config 00:02:13.337 net/failsafe: not in enabled drivers build config 00:02:13.338 net/fm10k: not in enabled drivers build config 00:02:13.338 net/gve: not in enabled drivers build config 00:02:13.338 net/hinic: not in enabled drivers build config 00:02:13.338 net/hns3: not in enabled drivers build config 00:02:13.338 net/iavf: not in enabled drivers build config 00:02:13.338 net/ice: not in enabled drivers build config 00:02:13.338 net/idpf: not in enabled drivers build config 00:02:13.338 net/igc: not in enabled drivers build config 00:02:13.338 net/ionic: not in enabled drivers build config 00:02:13.338 net/ipn3ke: not in enabled drivers build config 00:02:13.338 net/ixgbe: not in enabled drivers build config 00:02:13.338 net/mana: not in enabled drivers build config 00:02:13.338 net/memif: not in enabled drivers build config 00:02:13.338 net/mlx4: not in enabled drivers build config 00:02:13.338 net/mlx5: not in enabled drivers build config 00:02:13.338 net/mvneta: not in enabled drivers build config 00:02:13.338 net/mvpp2: not in enabled drivers build config 00:02:13.338 net/netvsc: not in enabled drivers build config 00:02:13.338 net/nfb: not in enabled drivers build config 00:02:13.338 net/nfp: not in enabled drivers build config 00:02:13.338 net/ngbe: not in enabled drivers build config 00:02:13.338 net/ntnic: not in enabled 
drivers build config 00:02:13.338 net/null: not in enabled drivers build config 00:02:13.338 net/octeontx: not in enabled drivers build config 00:02:13.338 net/octeon_ep: not in enabled drivers build config 00:02:13.338 net/pcap: not in enabled drivers build config 00:02:13.338 net/pfe: not in enabled drivers build config 00:02:13.338 net/qede: not in enabled drivers build config 00:02:13.338 net/ring: not in enabled drivers build config 00:02:13.338 net/sfc: not in enabled drivers build config 00:02:13.338 net/softnic: not in enabled drivers build config 00:02:13.338 net/tap: not in enabled drivers build config 00:02:13.338 net/thunderx: not in enabled drivers build config 00:02:13.338 net/txgbe: not in enabled drivers build config 00:02:13.338 net/vdev_netvsc: not in enabled drivers build config 00:02:13.338 net/vhost: not in enabled drivers build config 00:02:13.338 net/virtio: not in enabled drivers build config 00:02:13.338 net/vmxnet3: not in enabled drivers build config 00:02:13.338 raw/cnxk_bphy: not in enabled drivers build config 00:02:13.338 raw/cnxk_gpio: not in enabled drivers build config 00:02:13.338 raw/dpaa2_cmdif: not in enabled drivers build config 00:02:13.338 raw/ifpga: not in enabled drivers build config 00:02:13.338 raw/ntb: not in enabled drivers build config 00:02:13.338 raw/skeleton: not in enabled drivers build config 00:02:13.338 crypto/armv8: not in enabled drivers build config 00:02:13.338 crypto/bcmfs: not in enabled drivers build config 00:02:13.338 crypto/caam_jr: not in enabled drivers build config 00:02:13.338 crypto/ccp: not in enabled drivers build config 00:02:13.338 crypto/cnxk: not in enabled drivers build config 00:02:13.338 crypto/dpaa_sec: not in enabled drivers build config 00:02:13.338 crypto/dpaa2_sec: not in enabled drivers build config 00:02:13.338 crypto/ionic: not in enabled drivers build config 00:02:13.338 crypto/ipsec_mb: not in enabled drivers build config 00:02:13.338 crypto/mlx5: not in enabled drivers build config 00:02:13.338 crypto/mvsam: not in enabled drivers build config 00:02:13.338 crypto/nitrox: not in enabled drivers build config 00:02:13.338 crypto/null: not in enabled drivers build config 00:02:13.338 crypto/octeontx: not in enabled drivers build config 00:02:13.338 crypto/openssl: not in enabled drivers build config 00:02:13.338 crypto/scheduler: not in enabled drivers build config 00:02:13.338 crypto/uadk: not in enabled drivers build config 00:02:13.338 crypto/virtio: not in enabled drivers build config 00:02:13.338 compress/isal: not in enabled drivers build config 00:02:13.338 compress/mlx5: not in enabled drivers build config 00:02:13.338 compress/nitrox: not in enabled drivers build config 00:02:13.338 compress/octeontx: not in enabled drivers build config 00:02:13.338 compress/uadk: not in enabled drivers build config 00:02:13.338 compress/zlib: not in enabled drivers build config 00:02:13.338 regex/mlx5: not in enabled drivers build config 00:02:13.338 regex/cn9k: not in enabled drivers build config 00:02:13.338 ml/cnxk: not in enabled drivers build config 00:02:13.338 vdpa/ifc: not in enabled drivers build config 00:02:13.338 vdpa/mlx5: not in enabled drivers build config 00:02:13.338 vdpa/nfp: not in enabled drivers build config 00:02:13.338 vdpa/sfc: not in enabled drivers build config 00:02:13.338 event/cnxk: not in enabled drivers build config 00:02:13.338 event/dlb2: not in enabled drivers build config 00:02:13.338 event/dpaa: not in enabled drivers build config 00:02:13.338 event/dpaa2: not in enabled 
drivers build config 00:02:13.338 event/dsw: not in enabled drivers build config 00:02:13.338 event/opdl: not in enabled drivers build config 00:02:13.338 event/skeleton: not in enabled drivers build config 00:02:13.338 event/sw: not in enabled drivers build config 00:02:13.338 event/octeontx: not in enabled drivers build config 00:02:13.338 baseband/acc: not in enabled drivers build config 00:02:13.338 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:02:13.338 baseband/fpga_lte_fec: not in enabled drivers build config 00:02:13.338 baseband/la12xx: not in enabled drivers build config 00:02:13.338 baseband/null: not in enabled drivers build config 00:02:13.338 baseband/turbo_sw: not in enabled drivers build config 00:02:13.338 gpu/cuda: not in enabled drivers build config 00:02:13.338 00:02:13.338 00:02:13.338 Build targets in project: 221 00:02:13.338 00:02:13.338 DPDK 24.11.0-rc1 00:02:13.338 00:02:13.338 User defined options 00:02:13.338 libdir : lib 00:02:13.338 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:13.338 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:02:13.338 c_link_args : 00:02:13.338 enable_docs : false 00:02:13.338 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:02:13.338 enable_kmods : false 00:02:13.338 machine : native 00:02:13.338 tests : false 00:02:13.338 00:02:13.338 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:13.338 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 00:02:13.618 21:27:15 build_native_dpdk -- common/autobuild_common.sh@192 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:02:13.618 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:13.618 [1/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:13.618 [2/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:13.882 [3/721] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:13.882 [4/721] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:13.882 [5/721] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:13.883 [6/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:13.883 [7/721] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:13.883 [8/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:13.883 [9/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:13.883 [10/721] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:13.883 [11/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:13.883 [12/721] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:13.883 [13/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:13.883 [14/721] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:13.883 [15/721] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:13.883 [16/721] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:13.883 [17/721] Linking static target lib/librte_kvargs.a 00:02:13.883 [18/721] Linking static target lib/librte_pci.a 00:02:13.883 [19/721] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:13.883 [20/721] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:13.883 [21/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:13.883 [22/721] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:13.883 [23/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:13.883 [24/721] Linking static target lib/librte_log.a 00:02:14.144 [25/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:14.144 [26/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:14.144 [27/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:14.144 [28/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:14.144 [29/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:14.144 [30/721] Compiling C object lib/librte_argparse.a.p/argparse_rte_argparse.c.o 00:02:14.144 [31/721] Linking static target lib/librte_argparse.a 00:02:14.144 [32/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:14.144 [33/721] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.406 [34/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:14.406 [35/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:14.406 [36/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:14.406 [37/721] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:14.406 [38/721] Compiling C object lib/librte_eal.a.p/eal_common_rte_bitset.c.o 00:02:14.406 [39/721] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:14.406 [40/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:14.406 [41/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:14.406 [42/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:14.406 [43/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:14.406 [44/721] Compiling C object lib/librte_eal.a.p/eal_x86_rte_mmu.c.o 00:02:14.406 [45/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:14.406 [46/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:14.406 [47/721] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:14.406 [48/721] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.406 [49/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:14.406 [50/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:14.406 [51/721] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:14.406 [52/721] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:14.406 [53/721] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:14.406 [54/721] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:14.406 [55/721] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:14.406 [56/721] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:14.406 [57/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:14.406 [58/721] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:14.406 [59/721] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:14.406 [60/721] Linking static target lib/librte_meter.a 
00:02:14.406 [61/721] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:14.406 [62/721] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:14.406 [63/721] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:14.406 [64/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:14.406 [65/721] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:14.406 [66/721] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:14.406 [67/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:14.406 [68/721] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:14.406 [69/721] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:14.406 [70/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:14.406 [71/721] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:14.406 [72/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:14.406 [73/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:14.406 [74/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:14.406 [75/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:14.406 [76/721] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:14.406 [77/721] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:14.406 [78/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:14.406 [79/721] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:14.406 [80/721] Generating lib/argparse.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.406 [81/721] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:14.406 [82/721] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:14.406 [83/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:14.406 [84/721] Linking static target lib/librte_cmdline.a 00:02:14.406 [85/721] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:14.406 [86/721] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:14.670 [87/721] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:14.670 [88/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:14.670 [89/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:14.670 [90/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:14.670 [91/721] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:14.670 [92/721] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:14.670 [93/721] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:14.670 [94/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:14.670 [95/721] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:02:14.670 [96/721] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:14.670 [97/721] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:14.670 [98/721] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:14.670 [99/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:14.670 [100/721] Compiling C 
object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:14.670 [101/721] Linking static target lib/librte_ring.a 00:02:14.670 [102/721] Linking static target lib/librte_metrics.a 00:02:14.670 [103/721] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:14.670 [104/721] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:14.670 [105/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:14.670 [106/721] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:14.670 [107/721] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:14.670 [108/721] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:14.670 [109/721] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:14.670 [110/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:14.670 [111/721] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:14.670 [112/721] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:14.670 [113/721] Linking static target lib/librte_net.a 00:02:14.670 [114/721] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:14.670 [115/721] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:14.670 [116/721] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:02:14.670 [117/721] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:14.670 [118/721] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:14.670 [119/721] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:14.670 [120/721] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:02:14.934 [121/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:14.934 [122/721] Linking static target lib/librte_cfgfile.a 00:02:14.934 [123/721] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:14.934 [124/721] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.934 [125/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:14.934 [126/721] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:14.934 [127/721] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:14.934 [128/721] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.934 [129/721] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:14.934 [130/721] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:14.934 [131/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:14.934 [132/721] Linking target lib/librte_log.so.25.0 00:02:14.934 [133/721] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:14.934 [134/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:14.934 [135/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:14.934 [136/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:14.934 [137/721] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:14.934 [138/721] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:14.934 [139/721] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:14.934 [140/721] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:14.934 [141/721] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:14.934 [142/721] Compiling C object 
lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:14.934 [143/721] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.934 [144/721] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:14.934 [145/721] Linking static target lib/librte_timer.a 00:02:14.934 [146/721] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:15.223 [147/721] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:15.223 [148/721] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.223 [149/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:15.223 [150/721] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:15.223 [151/721] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:15.223 [152/721] Generating symbol file lib/librte_log.so.25.0.p/librte_log.so.25.0.symbols 00:02:15.223 [153/721] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:15.223 [154/721] Linking static target lib/librte_mempool.a 00:02:15.223 [155/721] Linking static target lib/librte_bitratestats.a 00:02:15.223 [156/721] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:15.223 [157/721] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:15.223 [158/721] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:15.223 [159/721] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:15.223 [160/721] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:15.223 [161/721] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:15.223 [162/721] Linking target lib/librte_kvargs.so.25.0 00:02:15.223 [163/721] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:15.223 [164/721] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:15.223 [165/721] Linking target lib/librte_argparse.so.25.0 00:02:15.223 [166/721] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.223 [167/721] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:15.223 [168/721] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:15.223 [169/721] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:15.223 [170/721] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:15.223 [171/721] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:15.223 [172/721] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:15.223 [173/721] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:15.223 [174/721] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:15.223 [175/721] Linking static target lib/librte_jobstats.a 00:02:15.223 [176/721] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:15.223 [177/721] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:15.223 [178/721] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:15.223 [179/721] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:15.223 [180/721] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:15.223 [181/721] Linking static target lib/librte_compressdev.a 00:02:15.223 [182/721] Compiling C object 
lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:15.223 [183/721] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.223 [184/721] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:15.486 [185/721] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:15.486 [186/721] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:15.486 [187/721] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:15.486 [188/721] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:15.486 [189/721] Generating symbol file lib/librte_kvargs.so.25.0.p/librte_kvargs.so.25.0.symbols 00:02:15.486 [190/721] Compiling C object lib/librte_port.a.p/port_port_log.c.o 00:02:15.486 [191/721] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:02:15.487 [192/721] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:02:15.487 [193/721] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:02:15.487 [194/721] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:02:15.487 [195/721] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:15.487 [196/721] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:02:15.487 [197/721] Linking static target lib/member/libsketch_avx512_tmp.a 00:02:15.487 [198/721] Linking static target lib/librte_rcu.a 00:02:15.487 [199/721] Linking static target lib/librte_dispatcher.a 00:02:15.487 [200/721] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:02:15.487 [201/721] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:15.487 [202/721] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:15.487 [203/721] Linking static target lib/librte_latencystats.a 00:02:15.487 [204/721] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:15.487 [205/721] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.487 [206/721] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:15.487 [207/721] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:15.487 [208/721] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:15.487 [209/721] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:15.487 [210/721] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:15.487 [211/721] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:15.487 [212/721] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:15.487 [213/721] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:15.487 [214/721] Linking static target lib/librte_bbdev.a 00:02:15.487 [215/721] Linking static target lib/librte_gpudev.a 00:02:15.487 [216/721] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:15.487 [217/721] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:15.487 [218/721] Linking static target lib/librte_telemetry.a 00:02:15.487 [219/721] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:15.487 [220/721] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:15.487 [221/721] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:02:15.487 [222/721] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:15.487 [223/721] Compiling C object 
lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:15.487 [224/721] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:15.487 [225/721] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:15.487 [226/721] Linking static target lib/librte_gro.a 00:02:15.487 [227/721] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:15.487 [228/721] Linking static target lib/librte_gso.a 00:02:15.487 [229/721] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:15.487 [230/721] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:15.487 [231/721] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:15.755 [232/721] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.755 [233/721] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:15.755 [234/721] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:15.755 [235/721] Linking static target lib/librte_eal.a 00:02:15.755 [236/721] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:15.755 [237/721] Linking static target lib/librte_stack.a 00:02:15.755 [238/721] Linking static target lib/librte_dmadev.a 00:02:15.755 [239/721] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:15.755 [240/721] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:15.755 [241/721] Linking static target lib/librte_distributor.a 00:02:15.755 [242/721] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:15.755 [243/721] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:15.755 [244/721] Compiling C object lib/librte_table.a.p/table_table_log.c.o 00:02:15.755 [245/721] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:02:15.755 [246/721] Linking static target lib/librte_regexdev.a 00:02:15.755 [247/721] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:15.755 [248/721] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:15.755 [249/721] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.755 [250/721] Linking static target lib/librte_ip_frag.a 00:02:15.755 [251/721] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:15.755 [252/721] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:15.755 [253/721] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:15.755 [254/721] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:15.755 [255/721] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:15.755 [256/721] Linking static target lib/librte_rawdev.a 00:02:15.755 [257/721] Linking static target lib/librte_mbuf.a 00:02:15.755 [258/721] Linking static target lib/librte_power.a 00:02:15.755 [259/721] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:15.755 [260/721] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:02:15.755 [261/721] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:15.755 [262/721] Linking static target lib/librte_pcapng.a 00:02:15.755 [263/721] Linking static target lib/librte_mldev.a 00:02:16.018 [264/721] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.018 [265/721] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:02:16.018 [266/721] Generating 
lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.018 [267/721] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:16.018 [268/721] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:02:16.018 [269/721] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:16.018 [270/721] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:02:16.018 [271/721] Linking static target lib/librte_reorder.a 00:02:16.018 [272/721] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.018 [273/721] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:16.018 [274/721] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:02:16.019 [275/721] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.019 [276/721] Linking static target lib/librte_rib.a 00:02:16.019 [277/721] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:16.019 [278/721] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:16.019 [279/721] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:02:16.019 [280/721] Linking static target lib/librte_security.a 00:02:16.019 [281/721] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.019 [282/721] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:16.019 [283/721] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:16.019 [284/721] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.019 [285/721] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:16.019 [286/721] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:16.019 [287/721] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:16.019 [288/721] Linking static target lib/librte_lpm.a 00:02:16.019 [289/721] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:16.019 [290/721] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:16.019 [291/721] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:02:16.019 [292/721] Linking static target lib/librte_bpf.a 00:02:16.019 [293/721] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.283 [294/721] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:16.283 [295/721] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:16.283 [296/721] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:16.283 [297/721] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.283 [298/721] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:16.283 [299/721] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.283 [300/721] Compiling C object lib/librte_node.a.p/node_null.c.o 00:02:16.283 [301/721] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:16.283 [302/721] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:16.283 [303/721] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:02:16.283 [304/721] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:16.283 [305/721] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.283 [306/721] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture 
output) 00:02:16.283 [307/721] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.283 [308/721] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:02:16.283 [309/721] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:02:16.283 [310/721] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.283 [311/721] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:16.283 [312/721] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:16.283 [313/721] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:16.541 [314/721] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:16.541 [315/721] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.541 [316/721] Linking target lib/librte_telemetry.so.25.0 00:02:16.541 [317/721] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:02:16.541 [318/721] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:02:16.541 [319/721] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:16.541 [320/721] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:16.541 [321/721] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:16.541 [322/721] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:02:16.541 [323/721] Linking static target lib/librte_efd.a 00:02:16.541 [324/721] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.541 [325/721] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:16.541 [326/721] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:02:16.541 [327/721] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:16.541 [328/721] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.541 [329/721] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:02:16.541 [330/721] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:16.541 [331/721] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:16.541 [332/721] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:16.541 [333/721] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:16.541 [334/721] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.541 [335/721] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:16.541 [336/721] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:16.541 [337/721] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:02:16.541 [338/721] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:16.542 [339/721] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.542 [340/721] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:02:16.542 [341/721] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:16.542 [342/721] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:02:16.542 [343/721] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:02:16.542 [344/721] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:16.542 [345/721] 
Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.803 [346/721] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:16.803 [347/721] Generating symbol file lib/librte_telemetry.so.25.0.p/librte_telemetry.so.25.0.symbols 00:02:16.803 [348/721] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:16.803 [349/721] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:02:16.803 [350/721] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.803 [351/721] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.803 [352/721] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:16.803 [353/721] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:02:16.803 [354/721] Compiling C object lib/librte_node.a.p/node_log.c.o 00:02:16.803 [355/721] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:02:16.803 [356/721] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:16.803 [357/721] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:02:16.804 [358/721] Generating app/graph/commands_hdr with a custom command (wrapped by meson to capture output) 00:02:16.804 [359/721] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:02:16.804 [360/721] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:16.804 [361/721] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:02:16.804 [362/721] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.804 [363/721] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:16.804 [364/721] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:02:16.804 [365/721] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.064 [366/721] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:17.064 [367/721] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.064 [368/721] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:17.064 [369/721] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:02:17.064 [370/721] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:02:17.064 [371/721] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:02:17.064 [372/721] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:02:17.064 [373/721] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:02:17.064 [374/721] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:17.064 [375/721] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:02:17.064 [376/721] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.064 [377/721] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:02:17.064 [378/721] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:17.064 [379/721] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:02:17.064 [380/721] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.064 [381/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:02:17.064 [382/721] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:17.064 
[383/721] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:02:17.064 [384/721] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:02:17.064 [385/721] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:17.064 [386/721] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:17.064 [387/721] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:17.064 [388/721] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:02:17.331 [389/721] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:02:17.331 [390/721] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:17.331 [391/721] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:02:17.331 [392/721] Linking static target lib/librte_pdump.a 00:02:17.331 [393/721] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:02:17.331 [394/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:02:17.331 [395/721] Linking static target lib/librte_graph.a 00:02:17.331 [396/721] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:02:17.331 [397/721] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:17.331 [398/721] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:02:17.331 [399/721] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:02:17.331 [400/721] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:02:17.331 [401/721] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:02:17.331 [402/721] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:02:17.594 [403/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:02:17.594 [404/721] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:02:17.594 [405/721] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:17.594 [406/721] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:17.594 [407/721] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:02:17.594 [408/721] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:02:17.594 [409/721] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:02:17.594 [410/721] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:17.594 [411/721] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:02:17.594 [412/721] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:17.594 [413/721] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:02:17.594 [414/721] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:17.594 [415/721] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:02:17.594 [416/721] Linking static target lib/librte_sched.a 00:02:17.594 [417/721] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:02:17.594 [418/721] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:17.594 [419/721] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:02:17.594 [420/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:02:17.594 [421/721] Compiling C object drivers/librte_bus_vdev.so.25.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:17.594 [422/721] Linking static target 
drivers/librte_bus_vdev.a 00:02:17.594 [423/721] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:02:17.594 [424/721] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:02:17.594 [425/721] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:02:17.594 [426/721] Linking static target lib/librte_table.a 00:02:17.594 [427/721] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:02:17.594 [428/721] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:02:17.594 [429/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:02:17.594 [430/721] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.594 [431/721] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:02:17.594 [432/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:02:17.594 [433/721] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:02:17.860 [434/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:02:17.860 [435/721] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:02:17.860 [436/721] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:02:17.860 [437/721] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:02:17.860 [438/721] Compiling C object app/dpdk-graph.p/graph_l2fwd.c.o 00:02:17.860 [439/721] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:17.860 [440/721] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:02:17.860 [441/721] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:17.860 [442/721] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:02:17.860 [443/721] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:02:17.860 [444/721] Linking static target lib/librte_cryptodev.a 00:02:17.860 [445/721] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:02:17.860 [446/721] Linking static target lib/librte_fib.a 00:02:17.860 [447/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:02:17.860 [448/721] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:17.861 [449/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:02:17.861 [450/721] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:02:17.861 [451/721] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:02:17.861 [452/721] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:17.861 [453/721] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:02:17.861 [454/721] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:17.861 [455/721] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:02:17.861 [456/721] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:02:18.124 [457/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:02:18.124 [458/721] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:02:18.125 [459/721] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:02:18.125 [460/721] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:18.125 [461/721] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:02:18.125 [462/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:02:18.125 [463/721] 
Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:02:18.125 [464/721] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:02:18.125 [465/721] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:18.125 [466/721] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:18.125 [467/721] Compiling C object drivers/librte_bus_pci.so.25.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:18.125 [468/721] Linking static target lib/librte_member.a 00:02:18.125 [469/721] Linking static target drivers/librte_bus_pci.a 00:02:18.125 [470/721] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:02:18.125 [471/721] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:18.125 [472/721] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:02:18.125 [473/721] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:18.125 [474/721] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.125 [475/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:02:18.125 [476/721] Linking static target lib/librte_pdcp.a 00:02:18.125 [477/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:02:18.125 [478/721] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:18.125 [479/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:02:18.125 [480/721] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:18.125 [481/721] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:02:18.125 [482/721] Linking static target lib/librte_hash.a 00:02:18.125 [483/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:18.125 [484/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:02:18.125 [485/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:18.125 [486/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:02:18.125 [487/721] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:18.125 [488/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:02:18.125 [489/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:02:18.125 [490/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:18.386 [491/721] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.386 [492/721] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:18.386 [493/721] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:18.386 [494/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:02:18.386 [495/721] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:18.386 [496/721] Linking static target lib/librte_node.a 00:02:18.386 [497/721] Linking static target lib/librte_ipsec.a 00:02:18.386 [498/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:18.386 [499/721] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:18.386 [500/721] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:18.386 [501/721] Compiling C object 
app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:18.386 [502/721] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:18.386 [503/721] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.386 [504/721] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:18.386 [505/721] Compiling C object drivers/librte_mempool_ring.so.25.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:18.386 [506/721] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:18.386 [507/721] Linking static target drivers/librte_mempool_ring.a 00:02:18.386 [508/721] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:02:18.386 [509/721] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:18.386 [510/721] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.386 [511/721] Linking static target lib/acl/libavx2_tmp.a 00:02:18.386 [512/721] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:18.386 [513/721] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:18.386 [514/721] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:18.386 [515/721] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:18.386 [516/721] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:18.386 [517/721] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:18.386 [518/721] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:18.386 [519/721] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.386 [520/721] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:18.386 [521/721] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:18.386 [522/721] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.386 [523/721] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:18.386 [524/721] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:18.386 [525/721] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:18.386 [526/721] Linking static target lib/librte_port.a 00:02:18.644 [527/721] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.644 [528/721] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:18.644 [529/721] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:02:18.644 [530/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:18.644 [531/721] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:18.644 [532/721] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:18.644 [533/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:18.644 [534/721] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.644 [535/721] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:18.644 [536/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:18.644 [537/721] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:18.644 [538/721] Compiling C 
object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:18.644 [539/721] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.644 [540/721] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:02:18.644 [541/721] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.644 [542/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:18.644 [543/721] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.644 [544/721] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:18.644 [545/721] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:18.644 [546/721] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:18.644 [547/721] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:02:18.644 [548/721] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:18.644 [549/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:18.644 [550/721] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:18.644 [551/721] Linking static target lib/librte_eventdev.a 00:02:18.644 [552/721] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:18.644 [553/721] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:02:18.644 [554/721] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:02:18.903 [555/721] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:18.903 [556/721] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:18.903 [557/721] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:18.903 [558/721] Linking static target lib/librte_acl.a 00:02:18.903 [559/721] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:18.903 [560/721] Compiling C object app/dpdk-test-security-perf.p/test_test_security_proto.c.o 00:02:18.903 [561/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:02:18.903 [562/721] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:02:18.903 [563/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:02:18.903 [564/721] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:18.903 [565/721] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:02:18.903 [566/721] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:18.903 [567/721] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:02:18.903 [568/721] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:02:18.903 [569/721] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:02:18.903 [570/721] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.903 [571/721] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:02:18.903 [572/721] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:02:18.903 [573/721] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:02:18.903 [574/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:02:19.162 [575/721] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:02:19.162 [576/721] Compiling C object 
app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:02:19.162 [577/721] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:02:19.162 [578/721] Linking static target drivers/net/i40e/base/libi40e_base.a 00:02:19.162 [579/721] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:02:19.162 [580/721] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.162 [581/721] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.162 [582/721] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:02:19.421 [583/721] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:02:19.421 [584/721] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:02:19.421 [585/721] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:19.421 [586/721] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:02:19.421 [587/721] Linking static target lib/librte_ethdev.a 00:02:19.680 [588/721] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.680 [589/721] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:02:19.939 [590/721] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:02:19.939 [591/721] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:02:20.506 [592/721] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:02:20.506 [593/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:02:20.764 [594/721] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:21.022 [595/721] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:02:21.022 [596/721] Linking static target drivers/libtmp_rte_net_i40e.a 00:02:21.280 [597/721] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:21.280 [598/721] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:02:21.280 [599/721] Compiling C object drivers/librte_net_i40e.so.25.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:21.280 [600/721] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:21.539 [601/721] Linking static target drivers/librte_net_i40e.a 00:02:21.797 [602/721] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:22.363 [603/721] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.622 [604/721] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.622 [605/721] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:22.622 [606/721] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:29.184 [607/721] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.184 [608/721] Linking target lib/librte_eal.so.25.0 00:02:29.184 [609/721] Generating symbol file lib/librte_eal.so.25.0.p/librte_eal.so.25.0.symbols 00:02:29.184 [610/721] Linking target lib/librte_dmadev.so.25.0 00:02:29.184 [611/721] Linking target lib/librte_timer.so.25.0 00:02:29.184 [612/721] Linking target lib/librte_stack.so.25.0 00:02:29.184 [613/721] Linking target lib/librte_rawdev.so.25.0 00:02:29.184 [614/721] Linking target lib/librte_pci.so.25.0 00:02:29.184 [615/721] Linking target lib/librte_ring.so.25.0 00:02:29.184 [616/721] 
Linking target lib/librte_meter.so.25.0 00:02:29.184 [617/721] Linking target lib/librte_jobstats.so.25.0 00:02:29.184 [618/721] Linking target lib/librte_cfgfile.so.25.0 00:02:29.184 [619/721] Linking target drivers/librte_bus_vdev.so.25.0 00:02:29.184 [620/721] Linking target lib/librte_acl.so.25.0 00:02:29.184 [621/721] Generating symbol file lib/librte_dmadev.so.25.0.p/librte_dmadev.so.25.0.symbols 00:02:29.184 [622/721] Generating symbol file lib/librte_timer.so.25.0.p/librte_timer.so.25.0.symbols 00:02:29.184 [623/721] Generating symbol file lib/librte_ring.so.25.0.p/librte_ring.so.25.0.symbols 00:02:29.184 [624/721] Generating symbol file lib/librte_meter.so.25.0.p/librte_meter.so.25.0.symbols 00:02:29.184 [625/721] Generating symbol file lib/librte_pci.so.25.0.p/librte_pci.so.25.0.symbols 00:02:29.184 [626/721] Generating symbol file drivers/librte_bus_vdev.so.25.0.p/librte_bus_vdev.so.25.0.symbols 00:02:29.184 [627/721] Generating symbol file lib/librte_acl.so.25.0.p/librte_acl.so.25.0.symbols 00:02:29.184 [628/721] Linking target lib/librte_rcu.so.25.0 00:02:29.184 [629/721] Linking target drivers/librte_bus_pci.so.25.0 00:02:29.184 [630/721] Linking target lib/librte_mempool.so.25.0 00:02:29.184 [631/721] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:29.184 [632/721] Linking static target lib/librte_pipeline.a 00:02:29.184 [633/721] Generating symbol file lib/librte_rcu.so.25.0.p/librte_rcu.so.25.0.symbols 00:02:29.184 [634/721] Generating symbol file drivers/librte_bus_pci.so.25.0.p/librte_bus_pci.so.25.0.symbols 00:02:29.184 [635/721] Generating symbol file lib/librte_mempool.so.25.0.p/librte_mempool.so.25.0.symbols 00:02:29.184 [636/721] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.184 [637/721] Linking target lib/librte_mbuf.so.25.0 00:02:29.184 [638/721] Linking target drivers/librte_mempool_ring.so.25.0 00:02:29.184 [639/721] Generating symbol file lib/librte_mbuf.so.25.0.p/librte_mbuf.so.25.0.symbols 00:02:29.184 [640/721] Linking target lib/librte_distributor.so.25.0 00:02:29.184 [641/721] Linking target lib/librte_reorder.so.25.0 00:02:29.184 [642/721] Linking target lib/librte_mldev.so.25.0 00:02:29.184 [643/721] Linking target lib/librte_regexdev.so.25.0 00:02:29.184 [644/721] Linking target lib/librte_cryptodev.so.25.0 00:02:29.184 [645/721] Linking target lib/librte_compressdev.so.25.0 00:02:29.184 [646/721] Linking target lib/librte_bbdev.so.25.0 00:02:29.184 [647/721] Linking target lib/librte_net.so.25.0 00:02:29.184 [648/721] Linking target lib/librte_gpudev.so.25.0 00:02:29.184 [649/721] Linking target lib/librte_sched.so.25.0 00:02:29.184 [650/721] Generating symbol file lib/librte_cryptodev.so.25.0.p/librte_cryptodev.so.25.0.symbols 00:02:29.184 [651/721] Generating symbol file lib/librte_sched.so.25.0.p/librte_sched.so.25.0.symbols 00:02:29.184 [652/721] Generating symbol file lib/librte_reorder.so.25.0.p/librte_reorder.so.25.0.symbols 00:02:29.184 [653/721] Generating symbol file lib/librte_net.so.25.0.p/librte_net.so.25.0.symbols 00:02:29.184 [654/721] Linking target lib/librte_security.so.25.0 00:02:29.184 [655/721] Linking target lib/librte_rib.so.25.0 00:02:29.184 [656/721] Linking target lib/librte_cmdline.so.25.0 00:02:29.184 [657/721] Linking target lib/librte_hash.so.25.0 00:02:29.184 [658/721] Linking target lib/librte_ethdev.so.25.0 00:02:29.184 [659/721] Generating symbol file lib/librte_security.so.25.0.p/librte_security.so.25.0.symbols 00:02:29.184 [660/721] 
Generating symbol file lib/librte_rib.so.25.0.p/librte_rib.so.25.0.symbols 00:02:29.184 [661/721] Generating symbol file lib/librte_hash.so.25.0.p/librte_hash.so.25.0.symbols 00:02:29.184 [662/721] Generating symbol file lib/librte_ethdev.so.25.0.p/librte_ethdev.so.25.0.symbols 00:02:29.184 [663/721] Linking target lib/librte_pdcp.so.25.0 00:02:29.184 [664/721] Linking target lib/librte_member.so.25.0 00:02:29.184 [665/721] Linking target lib/librte_lpm.so.25.0 00:02:29.184 [666/721] Linking target lib/librte_efd.so.25.0 00:02:29.184 [667/721] Linking target lib/librte_fib.so.25.0 00:02:29.184 [668/721] Linking target lib/librte_ipsec.so.25.0 00:02:29.184 [669/721] Linking target lib/librte_pcapng.so.25.0 00:02:29.184 [670/721] Linking target lib/librte_gro.so.25.0 00:02:29.184 [671/721] Linking target lib/librte_metrics.so.25.0 00:02:29.184 [672/721] Linking target lib/librte_gso.so.25.0 00:02:29.184 [673/721] Linking target lib/librte_ip_frag.so.25.0 00:02:29.184 [674/721] Linking target lib/librte_bpf.so.25.0 00:02:29.184 [675/721] Linking target lib/librte_power.so.25.0 00:02:29.185 [676/721] Linking target lib/librte_eventdev.so.25.0 00:02:29.185 [677/721] Linking target drivers/librte_net_i40e.so.25.0 00:02:29.443 [678/721] Generating symbol file lib/librte_lpm.so.25.0.p/librte_lpm.so.25.0.symbols 00:02:29.443 [679/721] Generating symbol file lib/librte_metrics.so.25.0.p/librte_metrics.so.25.0.symbols 00:02:29.443 [680/721] Generating symbol file lib/librte_ip_frag.so.25.0.p/librte_ip_frag.so.25.0.symbols 00:02:29.443 [681/721] Generating symbol file lib/librte_pcapng.so.25.0.p/librte_pcapng.so.25.0.symbols 00:02:29.443 [682/721] Generating symbol file lib/librte_ipsec.so.25.0.p/librte_ipsec.so.25.0.symbols 00:02:29.443 [683/721] Generating symbol file lib/librte_bpf.so.25.0.p/librte_bpf.so.25.0.symbols 00:02:29.443 [684/721] Linking target lib/librte_bitratestats.so.25.0 00:02:29.443 [685/721] Generating symbol file lib/librte_eventdev.so.25.0.p/librte_eventdev.so.25.0.symbols 00:02:29.443 [686/721] Linking target lib/librte_latencystats.so.25.0 00:02:29.443 [687/721] Linking target lib/librte_pdump.so.25.0 00:02:29.443 [688/721] Linking target lib/librte_graph.so.25.0 00:02:29.443 [689/721] Linking target lib/librte_dispatcher.so.25.0 00:02:29.443 [690/721] Linking target lib/librte_port.so.25.0 00:02:29.443 [691/721] Generating symbol file lib/librte_graph.so.25.0.p/librte_graph.so.25.0.symbols 00:02:29.701 [692/721] Linking target lib/librte_node.so.25.0 00:02:29.701 [693/721] Generating symbol file lib/librte_port.so.25.0.p/librte_port.so.25.0.symbols 00:02:29.701 [694/721] Linking target lib/librte_table.so.25.0 00:02:29.701 [695/721] Generating symbol file lib/librte_table.so.25.0.p/librte_table.so.25.0.symbols 00:02:29.960 [696/721] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:29.960 [697/721] Linking static target lib/librte_vhost.a 00:02:30.219 [698/721] Linking target app/dpdk-test-cmdline 00:02:30.219 [699/721] Linking target app/dpdk-test-compress-perf 00:02:30.219 [700/721] Linking target app/dpdk-test-dma-perf 00:02:30.219 [701/721] Linking target app/dpdk-graph 00:02:30.219 [702/721] Linking target app/dpdk-test-fib 00:02:30.219 [703/721] Linking target app/dpdk-pdump 00:02:30.479 [704/721] Linking target app/dpdk-test-regex 00:02:30.479 [705/721] Linking target app/dpdk-proc-info 00:02:30.479 [706/721] Linking target app/dpdk-test-acl 00:02:30.479 [707/721] Linking target app/dpdk-test-crypto-perf 00:02:30.479 [708/721] Linking target 
app/dpdk-test-flow-perf 00:02:30.479 [709/721] Linking target app/dpdk-test-security-perf 00:02:30.479 [710/721] Linking target app/dpdk-test-bbdev 00:02:30.479 [711/721] Linking target app/dpdk-test-sad 00:02:30.479 [712/721] Linking target app/dpdk-dumpcap 00:02:30.479 [713/721] Linking target app/dpdk-test-gpudev 00:02:30.479 [714/721] Linking target app/dpdk-test-mldev 00:02:30.479 [715/721] Linking target app/dpdk-test-pipeline 00:02:30.479 [716/721] Linking target app/dpdk-test-eventdev 00:02:30.479 [717/721] Linking target app/dpdk-testpmd 00:02:32.384 [718/721] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.384 [719/721] Linking target lib/librte_vhost.so.25.0 00:02:34.289 [720/721] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.289 [721/721] Linking target lib/librte_pipeline.so.25.0 00:02:34.289 21:27:36 build_native_dpdk -- common/autobuild_common.sh@194 -- $ uname -s 00:02:34.549 21:27:36 build_native_dpdk -- common/autobuild_common.sh@194 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:34.549 21:27:36 build_native_dpdk -- common/autobuild_common.sh@207 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:34.549 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:34.549 [0/1] Installing files. 00:02:34.812 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints/memory.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints/cpu.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints/counters.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:02:34.812 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipv6_addr_swap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipv6_addr_swap.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.812 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 
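The install step above is copying each DPDK example application verbatim (sources plus Makefile) into build/share/dpdk/examples, so every example can later be rebuilt standalone against the installed libraries. As a point of reference for what these installed sources contain, here is a minimal sketch in the spirit of the examples/helloworld/main.c installed earlier; the EAL calls (rte_eal_init, rte_eal_mp_remote_launch, rte_eal_mp_wait_lcore) are the real DPDK API, but the program itself is an illustration, not a file taken from this build:

#include <stdio.h>
#include <rte_eal.h>
#include <rte_launch.h>
#include <rte_lcore.h>
#include <rte_debug.h>

/* Worker body run once on every available lcore. */
static int
lcore_hello(void *arg)
{
	(void)arg;
	printf("hello from core %u\n", rte_lcore_id());
	return 0;
}

int
main(int argc, char **argv)
{
	/* Initialize the Environment Abstraction Layer (EAL). */
	if (rte_eal_init(argc, argv) < 0)
		rte_panic("Cannot init EAL\n");

	/* Launch the worker on all lcores, including the main one. */
	rte_eal_mp_remote_launch(lcore_hello, NULL, CALL_MAIN);
	rte_eal_mp_wait_lcore();

	rte_eal_cleanup();
	return 0;
}

The Makefiles installed alongside these sources resolve headers and link flags through pkg-config (libdpdk), which is why the install prefix recorded in the paths above matters to downstream consumers.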
00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_eddsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.813 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:34.814 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:34.814 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:34.814 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:34.815 
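The qos_meter sources installed just above (rte_policer.c, main.c) are built around librte_meter's srTCM/trTCM packet-coloring API. A compact sketch of the srTCM flow follows — profile, then run-time context, then a per-packet color check — with arbitrary, assumed rate and burst parameters (the cir/cbs/ebs values below are placeholders, not taken from the example's configuration):

#include <stdio.h>
#include <rte_eal.h>
#include <rte_debug.h>
#include <rte_cycles.h>
#include <rte_meter.h>

int
main(int argc, char **argv)
{
	/* EAL init is needed so the TSC frequency used by the meter is known. */
	if (rte_eal_init(argc, argv) < 0)
		rte_panic("Cannot init EAL\n");

	/* Hypothetical srTCM profile: 1 Mbyte/s committed rate, 64 KB bursts. */
	struct rte_meter_srtcm_params params = {
		.cir = 1000000,
		.cbs = 65536,
		.ebs = 65536,
	};
	struct rte_meter_srtcm_profile profile;
	struct rte_meter_srtcm meter;

	if (rte_meter_srtcm_profile_config(&profile, &params) != 0 ||
	    rte_meter_srtcm_config(&meter, &profile) != 0)
		rte_panic("srTCM configuration failed\n");

	/* Color one hypothetical 1500-byte packet arriving "now". */
	enum rte_color color = rte_meter_srtcm_color_blind_check(
			&meter, &profile, rte_rdtsc(), 1500);
	printf("packet color: %d (0=green, 1=yellow, 2=red)\n", (int)color);

	rte_eal_cleanup();
	return 0;
}

The two-step split (shared profile vs. per-flow run-time context) lets many meter instances reuse one precomputed token-bucket profile, which is the design the installed example exercises.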
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:34.815 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.815 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.816 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:34.817 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:34.817 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:34.818 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:34.818 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:34.818 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_log.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_kvargs.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_argparse.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_argparse.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_telemetry.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_eal.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_rcu.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_mempool.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_mbuf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_net.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_meter.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_meter.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_ethdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_cmdline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_metrics.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_hash.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_timer.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_acl.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_bbdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_bitratestats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_bpf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_cfgfile.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_compressdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_cryptodev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_distributor.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_dmadev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:34.818 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_efd.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.818 Installing lib/librte_eventdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_dispatcher.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_gpudev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_gro.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_gso.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_ip_frag.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_jobstats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_latencystats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_lpm.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_member.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_pcapng.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_power.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_rawdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_regexdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_mldev.so.25.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_rib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_reorder.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_sched.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_security.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_stack.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_vhost.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_ipsec.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_pdcp.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_fib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_port.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_pdump.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_table.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_pipeline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_graph.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing lib/librte_node.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing 
drivers/librte_bus_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:35.081 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing drivers/librte_bus_vdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:35.081 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing drivers/librte_mempool_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:35.081 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.081 Installing drivers/librte_net_i40e.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:35.082 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/argparse/rte_argparse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitset.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.082 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ptr_compress/rte_ptr_compress.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_cksum.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip4.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
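(Context for the listing above and below: these "Installing ... to ..." entries are meson's install step staging DPDK's public headers into the job-local prefix dpdk/build/include instead of a system prefix, so the SPDK configure step later in this log can consume the build via --with-dpdk. The sketch below shows an equivalent staged install; it assumes a DPDK checkout at $DPDK with meson and ninja on PATH, and the setup options are illustrative since the ones this job used are not shown in this excerpt.)

    # Sketch only: stage a DPDK install under the source tree rather than /usr/local.
    DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
    meson setup "$DPDK/build-tmp" "$DPDK" --prefix="$DPDK/build"
    ninja -C "$DPDK/build-tmp" install   # prints "Installing <file> to <dir>" entries like those above
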
00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.083 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.084 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.085 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry-exporter.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:35.086 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:35.086 Installing symlink pointing to librte_log.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.25 00:02:35.086 Installing symlink pointing to librte_log.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:35.086 Installing symlink pointing to librte_kvargs.so.25.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.25 00:02:35.086 Installing symlink pointing to librte_kvargs.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:35.086 Installing symlink pointing to librte_argparse.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_argparse.so.25 00:02:35.086 Installing symlink pointing to librte_argparse.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_argparse.so 00:02:35.086 Installing symlink pointing to librte_telemetry.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.25 00:02:35.086 Installing symlink pointing to librte_telemetry.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:35.086 Installing symlink pointing to librte_eal.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.25 00:02:35.086 Installing symlink pointing to librte_eal.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:35.086 Installing symlink pointing to librte_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.25 00:02:35.086 Installing symlink pointing to librte_ring.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:35.086 Installing symlink pointing to librte_rcu.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.25 00:02:35.086 Installing symlink pointing to librte_rcu.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:35.086 Installing symlink pointing to librte_mempool.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.25 00:02:35.086 Installing symlink pointing to librte_mempool.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:35.086 Installing symlink pointing to librte_mbuf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.25 00:02:35.086 Installing symlink pointing to librte_mbuf.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:35.086 Installing symlink pointing to librte_net.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.25 00:02:35.086 Installing symlink pointing to librte_net.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:35.086 Installing symlink pointing to librte_meter.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.25 00:02:35.086 Installing symlink pointing to librte_meter.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:35.086 Installing symlink pointing to librte_ethdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.25 00:02:35.086 Installing symlink pointing to librte_ethdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:35.086 Installing symlink pointing to librte_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.25 00:02:35.086 Installing symlink pointing to librte_pci.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:35.086 Installing symlink pointing to librte_cmdline.so.25.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.25 00:02:35.086 Installing symlink pointing to librte_cmdline.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:35.086 Installing symlink pointing to librte_metrics.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.25 00:02:35.086 Installing symlink pointing to librte_metrics.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:35.086 Installing symlink pointing to librte_hash.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.25 00:02:35.086 Installing symlink pointing to librte_hash.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:35.086 Installing symlink pointing to librte_timer.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.25 00:02:35.086 Installing symlink pointing to librte_timer.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:35.086 Installing symlink pointing to librte_acl.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.25 00:02:35.086 Installing symlink pointing to librte_acl.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:35.086 Installing symlink pointing to librte_bbdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.25 00:02:35.086 Installing symlink pointing to librte_bbdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:35.086 Installing symlink pointing to librte_bitratestats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.25 00:02:35.086 Installing symlink pointing to librte_bitratestats.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:35.086 Installing symlink pointing to librte_bpf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.25 00:02:35.086 Installing symlink pointing to librte_bpf.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:35.086 Installing symlink pointing to librte_cfgfile.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.25 00:02:35.086 Installing symlink pointing to librte_cfgfile.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:35.086 Installing symlink pointing to librte_compressdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.25 00:02:35.086 Installing symlink pointing to librte_compressdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:35.086 Installing symlink pointing to librte_cryptodev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.25 00:02:35.086 Installing symlink pointing to librte_cryptodev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:35.086 Installing symlink pointing to librte_distributor.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.25 00:02:35.086 Installing symlink pointing to librte_distributor.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:35.086 
Installing symlink pointing to librte_dmadev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.25 00:02:35.086 Installing symlink pointing to librte_dmadev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:35.086 Installing symlink pointing to librte_efd.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.25 00:02:35.086 Installing symlink pointing to librte_efd.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:35.086 Installing symlink pointing to librte_eventdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.25 00:02:35.086 Installing symlink pointing to librte_eventdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:35.086 Installing symlink pointing to librte_dispatcher.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.25 00:02:35.086 Installing symlink pointing to librte_dispatcher.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:35.086 Installing symlink pointing to librte_gpudev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.25 00:02:35.086 Installing symlink pointing to librte_gpudev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:35.086 Installing symlink pointing to librte_gro.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.25 00:02:35.086 Installing symlink pointing to librte_gro.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:35.086 Installing symlink pointing to librte_gso.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.25 00:02:35.086 Installing symlink pointing to librte_gso.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:35.086 Installing symlink pointing to librte_ip_frag.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.25 00:02:35.086 Installing symlink pointing to librte_ip_frag.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:35.086 Installing symlink pointing to librte_jobstats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.25 00:02:35.086 Installing symlink pointing to librte_jobstats.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:35.086 Installing symlink pointing to librte_latencystats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.25 00:02:35.086 Installing symlink pointing to librte_latencystats.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:35.086 Installing symlink pointing to librte_lpm.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.25 00:02:35.087 Installing symlink pointing to librte_lpm.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:35.087 Installing symlink pointing to librte_member.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.25 00:02:35.087 Installing symlink pointing to librte_member.so.25 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:35.087 Installing symlink pointing to librte_pcapng.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.25 00:02:35.087 Installing symlink pointing to librte_pcapng.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:35.087 Installing symlink pointing to librte_power.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.25 00:02:35.087 Installing symlink pointing to librte_power.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:35.087 Installing symlink pointing to librte_rawdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.25 00:02:35.087 Installing symlink pointing to librte_rawdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:35.087 Installing symlink pointing to librte_regexdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.25 00:02:35.087 Installing symlink pointing to librte_regexdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:35.087 Installing symlink pointing to librte_mldev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.25 00:02:35.087 Installing symlink pointing to librte_mldev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:35.087 Installing symlink pointing to librte_rib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.25 00:02:35.087 Installing symlink pointing to librte_rib.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:35.087 Installing symlink pointing to librte_reorder.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.25 00:02:35.087 Installing symlink pointing to librte_reorder.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:35.087 Installing symlink pointing to librte_sched.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.25 00:02:35.087 Installing symlink pointing to librte_sched.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:35.087 Installing symlink pointing to librte_security.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.25 00:02:35.087 Installing symlink pointing to librte_security.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:35.087 Installing symlink pointing to librte_stack.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.25 00:02:35.087 './librte_bus_pci.so' -> 'dpdk/pmds-25.0/librte_bus_pci.so' 00:02:35.087 './librte_bus_pci.so.25' -> 'dpdk/pmds-25.0/librte_bus_pci.so.25' 00:02:35.087 './librte_bus_pci.so.25.0' -> 'dpdk/pmds-25.0/librte_bus_pci.so.25.0' 00:02:35.087 './librte_bus_vdev.so' -> 'dpdk/pmds-25.0/librte_bus_vdev.so' 00:02:35.087 './librte_bus_vdev.so.25' -> 'dpdk/pmds-25.0/librte_bus_vdev.so.25' 00:02:35.087 './librte_bus_vdev.so.25.0' -> 'dpdk/pmds-25.0/librte_bus_vdev.so.25.0' 00:02:35.087 './librte_mempool_ring.so' -> 'dpdk/pmds-25.0/librte_mempool_ring.so' 00:02:35.087 './librte_mempool_ring.so.25' -> 'dpdk/pmds-25.0/librte_mempool_ring.so.25' 
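(Context: the "Installing symlink" entries above implement standard ELF shared-library versioning, and the './librte_*' relocations around this point move the driver (PMD) objects into the dpdk/pmds-25.0 plugin directory; the symlink-drivers-solibs.sh script run shortly after appears to link them back under lib/. For each library the chain is: the real object librte_<name>.so.25.0, a .so.25 link matching the SONAME that the runtime loader resolves, and a bare .so link found by the link editor via -lrte_<name>. Below is a sketch of that chain for one library, with names taken from the log:)

    # Illustrative only: the version-symlink chain being installed above.
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
    ln -sf librte_eal.so.25.0 librte_eal.so.25   # SONAME link, resolved by the dynamic loader
    ln -sf librte_eal.so.25 librte_eal.so        # dev link, resolved when linking with -lrte_eal
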
00:02:35.087 './librte_mempool_ring.so.25.0' -> 'dpdk/pmds-25.0/librte_mempool_ring.so.25.0' 00:02:35.087 './librte_net_i40e.so' -> 'dpdk/pmds-25.0/librte_net_i40e.so' 00:02:35.087 './librte_net_i40e.so.25' -> 'dpdk/pmds-25.0/librte_net_i40e.so.25' 00:02:35.087 './librte_net_i40e.so.25.0' -> 'dpdk/pmds-25.0/librte_net_i40e.so.25.0' 00:02:35.087 Installing symlink pointing to librte_stack.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:35.087 Installing symlink pointing to librte_vhost.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.25 00:02:35.087 Installing symlink pointing to librte_vhost.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:35.087 Installing symlink pointing to librte_ipsec.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.25 00:02:35.087 Installing symlink pointing to librte_ipsec.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:35.087 Installing symlink pointing to librte_pdcp.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.25 00:02:35.087 Installing symlink pointing to librte_pdcp.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:35.087 Installing symlink pointing to librte_fib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.25 00:02:35.087 Installing symlink pointing to librte_fib.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:35.087 Installing symlink pointing to librte_port.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.25 00:02:35.087 Installing symlink pointing to librte_port.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:35.087 Installing symlink pointing to librte_pdump.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.25 00:02:35.087 Installing symlink pointing to librte_pdump.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:35.087 Installing symlink pointing to librte_table.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.25 00:02:35.087 Installing symlink pointing to librte_table.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:35.087 Installing symlink pointing to librte_pipeline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.25 00:02:35.087 Installing symlink pointing to librte_pipeline.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:35.087 Installing symlink pointing to librte_graph.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.25 00:02:35.087 Installing symlink pointing to librte_graph.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:35.087 Installing symlink pointing to librte_node.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.25 00:02:35.087 Installing symlink pointing to librte_node.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:35.087 Installing symlink pointing to librte_bus_pci.so.25.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_pci.so.25 00:02:35.087 Installing symlink pointing to librte_bus_pci.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_pci.so 00:02:35.087 Installing symlink pointing to librte_bus_vdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_vdev.so.25 00:02:35.087 Installing symlink pointing to librte_bus_vdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_vdev.so 00:02:35.087 Installing symlink pointing to librte_mempool_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_mempool_ring.so.25 00:02:35.087 Installing symlink pointing to librte_mempool_ring.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_mempool_ring.so 00:02:35.087 Installing symlink pointing to librte_net_i40e.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_net_i40e.so.25 00:02:35.087 Installing symlink pointing to librte_net_i40e.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_net_i40e.so 00:02:35.087 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-25.0' 00:02:35.347 21:27:36 build_native_dpdk -- common/autobuild_common.sh@213 -- $ cat 00:02:35.347 21:27:36 build_native_dpdk -- common/autobuild_common.sh@218 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:35.347 00:02:35.347 real 0m27.872s 00:02:35.347 user 8m22.985s 00:02:35.347 sys 2m35.441s 00:02:35.347 21:27:36 build_native_dpdk -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:35.347 21:27:36 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:02:35.347 ************************************ 00:02:35.347 END TEST build_native_dpdk 00:02:35.347 ************************************ 00:02:35.347 21:27:36 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:35.347 21:27:36 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:35.347 21:27:36 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:35.347 21:27:36 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:35.347 21:27:36 -- common/autobuild_common.sh@438 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:35.347 21:27:36 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:02:35.347 21:27:36 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:35.347 21:27:36 -- common/autotest_common.sh@10 -- $ set +x 00:02:35.347 ************************************ 00:02:35.347 START TEST autobuild_llvm_precompile 00:02:35.347 ************************************ 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autotest_common.sh@1125 -- $ _llvm_precompile 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:02:35.347 Target: x86_64-redhat-linux-gnu 00:02:35.347 Thread model: posix 00:02:35.347 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17 
00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:35.347 21:27:36 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:35.607 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:35.865 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:35.866 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:35.866 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:36.124 Using 'verbs' RDMA provider 00:02:52.492 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:04.708 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:04.708 Creating mk/config.mk...done. 00:03:04.708 Creating mk/cc.flags.mk...done. 00:03:04.708 Type 'make' to build. 
00:03:04.708 00:03:04.708 real 0m29.474s 00:03:04.708 user 0m12.835s 00:03:04.708 sys 0m16.016s 00:03:04.708 21:28:06 autobuild_llvm_precompile -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:03:04.708 21:28:06 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x 00:03:04.708 ************************************ 00:03:04.708 END TEST autobuild_llvm_precompile 00:03:04.708 ************************************ 00:03:04.708 21:28:06 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:03:04.708 21:28:06 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:03:04.708 21:28:06 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:03:04.708 21:28:06 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:03:04.708 21:28:06 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:03:04.968 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:03:05.227 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:03:05.227 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:05.227 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:03:05.795 Using 'verbs' RDMA provider 00:03:18.943 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:28.925 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:29.494 Creating mk/config.mk...done. 00:03:29.494 Creating mk/cc.flags.mk...done. 00:03:29.494 Type 'make' to build. 00:03:29.494 21:28:31 -- spdk/autobuild.sh@70 -- $ run_test make make -j112 00:03:29.494 21:28:31 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:03:29.494 21:28:31 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:03:29.494 21:28:31 -- common/autotest_common.sh@10 -- $ set +x 00:03:29.494 ************************************ 00:03:29.494 START TEST make 00:03:29.494 ************************************ 00:03:29.494 21:28:31 make -- common/autotest_common.sh@1125 -- $ make -j112 00:03:30.060 make[1]: Nothing to be done for 'all'. 
00:03:31.443 The Meson build system 00:03:31.443 Version: 1.5.0 00:03:31.443 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:31.443 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:31.443 Build type: native build 00:03:31.443 Project name: libvfio-user 00:03:31.443 Project version: 0.0.1 00:03:31.443 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:03:31.443 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:03:31.443 Host machine cpu family: x86_64 00:03:31.443 Host machine cpu: x86_64 00:03:31.443 Run-time dependency threads found: YES 00:03:31.443 Library dl found: YES 00:03:31.443 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:31.443 Run-time dependency json-c found: YES 0.17 00:03:31.443 Run-time dependency cmocka found: YES 1.1.7 00:03:31.443 Program pytest-3 found: NO 00:03:31.443 Program flake8 found: NO 00:03:31.443 Program misspell-fixer found: NO 00:03:31.443 Program restructuredtext-lint found: NO 00:03:31.443 Program valgrind found: YES (/usr/bin/valgrind) 00:03:31.443 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:31.443 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:31.443 Compiler for C supports arguments -Wwrite-strings: YES 00:03:31.443 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:31.443 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:31.443 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:31.443 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:03:31.443 Build targets in project: 8 00:03:31.443 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:31.443 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:31.443 00:03:31.443 libvfio-user 0.0.1 00:03:31.443 00:03:31.443 User defined options 00:03:31.443 buildtype : debug 00:03:31.443 default_library: static 00:03:31.443 libdir : /usr/local/lib 00:03:31.443 00:03:31.443 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:32.011 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:32.011 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:32.011 [2/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:32.011 [3/36] Compiling C object samples/null.p/null.c.o 00:03:32.011 [4/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:32.011 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:32.011 [6/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:32.011 [7/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:32.011 [8/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:32.011 [9/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:32.011 [10/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:32.011 [11/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:32.011 [12/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:32.012 [13/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:32.012 [14/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:32.012 [15/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:32.012 [16/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:32.012 [17/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:32.012 [18/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:32.012 [19/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:32.012 [20/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:32.012 [21/36] Compiling C object samples/server.p/server.c.o 00:03:32.012 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:32.012 [23/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:32.012 [24/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:32.012 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:32.012 [26/36] Compiling C object samples/client.p/client.c.o 00:03:32.012 [27/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:32.012 [28/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:32.012 [29/36] Linking static target lib/libvfio-user.a 00:03:32.012 [30/36] Linking target samples/client 00:03:32.012 [31/36] Linking target test/unit_tests 00:03:32.012 [32/36] Linking target samples/shadow_ioeventfd_server 00:03:32.012 [33/36] Linking target samples/null 00:03:32.012 [34/36] Linking target samples/server 00:03:32.012 [35/36] Linking target samples/gpio-pci-idio-16 00:03:32.012 [36/36] Linking target samples/lspci 00:03:32.012 INFO: autodetecting backend as ninja 00:03:32.012 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:32.271 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:32.529 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:32.529 ninja: no work to do. 00:03:44.739 CC lib/ut_mock/mock.o 00:03:44.740 CC lib/ut/ut.o 00:03:44.740 CC lib/log/log.o 00:03:44.740 CC lib/log/log_flags.o 00:03:44.740 CC lib/log/log_deprecated.o 00:03:44.740 LIB libspdk_ut.a 00:03:44.740 LIB libspdk_ut_mock.a 00:03:44.740 LIB libspdk_log.a 00:03:45.307 CC lib/ioat/ioat.o 00:03:45.307 CC lib/dma/dma.o 00:03:45.307 CC lib/util/base64.o 00:03:45.307 CC lib/util/bit_array.o 00:03:45.307 CC lib/util/cpuset.o 00:03:45.307 CXX lib/trace_parser/trace.o 00:03:45.307 CC lib/util/crc16.o 00:03:45.307 CC lib/util/crc32.o 00:03:45.307 CC lib/util/crc32c.o 00:03:45.307 CC lib/util/crc32_ieee.o 00:03:45.307 CC lib/util/crc64.o 00:03:45.307 CC lib/util/dif.o 00:03:45.307 CC lib/util/file.o 00:03:45.307 CC lib/util/fd.o 00:03:45.307 CC lib/util/fd_group.o 00:03:45.307 CC lib/util/hexlify.o 00:03:45.307 CC lib/util/iov.o 00:03:45.307 CC lib/util/math.o 00:03:45.307 CC lib/util/net.o 00:03:45.307 CC lib/util/pipe.o 00:03:45.307 CC lib/util/strerror_tls.o 00:03:45.307 CC lib/util/string.o 00:03:45.307 CC lib/util/uuid.o 00:03:45.307 CC lib/util/xor.o 00:03:45.307 CC lib/util/zipf.o 00:03:45.307 CC lib/util/md5.o 00:03:45.307 CC lib/vfio_user/host/vfio_user_pci.o 00:03:45.307 CC lib/vfio_user/host/vfio_user.o 00:03:45.307 LIB libspdk_dma.a 00:03:45.307 LIB libspdk_ioat.a 00:03:45.566 LIB libspdk_vfio_user.a 00:03:45.566 LIB libspdk_util.a 00:03:45.566 LIB libspdk_trace_parser.a 00:03:45.824 CC lib/json/json_parse.o 00:03:45.824 CC lib/conf/conf.o 00:03:45.824 CC lib/json/json_util.o 00:03:45.824 CC lib/json/json_write.o 00:03:45.824 CC lib/idxd/idxd.o 00:03:45.824 CC lib/idxd/idxd_user.o 00:03:45.824 CC lib/idxd/idxd_kernel.o 00:03:45.824 CC lib/rdma_utils/rdma_utils.o 00:03:45.824 CC lib/env_dpdk/memory.o 00:03:45.824 CC lib/env_dpdk/env.o 00:03:45.824 CC lib/env_dpdk/init.o 00:03:45.824 CC lib/env_dpdk/pci.o 00:03:45.824 CC lib/env_dpdk/pci_ioat.o 00:03:45.824 CC lib/env_dpdk/threads.o 00:03:45.824 CC lib/env_dpdk/pci_vmd.o 00:03:45.824 CC lib/env_dpdk/pci_virtio.o 00:03:45.824 CC lib/env_dpdk/pci_event.o 00:03:45.824 CC lib/env_dpdk/pci_idxd.o 00:03:45.824 CC lib/env_dpdk/pci_dpdk.o 00:03:45.824 CC lib/env_dpdk/sigbus_handler.o 00:03:45.824 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:45.824 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:45.824 CC lib/vmd/led.o 00:03:45.824 CC lib/vmd/vmd.o 00:03:45.824 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:45.824 CC lib/rdma_provider/common.o 00:03:45.824 LIB libspdk_conf.a 00:03:46.082 LIB libspdk_rdma_provider.a 00:03:46.082 LIB libspdk_rdma_utils.a 00:03:46.082 LIB libspdk_json.a 00:03:46.082 LIB libspdk_idxd.a 00:03:46.082 LIB libspdk_vmd.a 00:03:46.341 CC lib/jsonrpc/jsonrpc_server.o 00:03:46.341 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:46.341 CC lib/jsonrpc/jsonrpc_client.o 00:03:46.341 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:46.341 LIB libspdk_jsonrpc.a 00:03:46.600 LIB libspdk_env_dpdk.a 00:03:46.859 CC lib/rpc/rpc.o 00:03:46.859 LIB libspdk_rpc.a 00:03:47.117 CC lib/trace/trace.o 00:03:47.117 CC lib/trace/trace_rpc.o 00:03:47.117 CC lib/trace/trace_flags.o 00:03:47.117 CC lib/keyring/keyring.o 00:03:47.117 CC lib/notify/notify.o 00:03:47.117 CC lib/keyring/keyring_rpc.o 00:03:47.117 CC lib/notify/notify_rpc.o 00:03:47.375 LIB libspdk_notify.a 00:03:47.375 LIB libspdk_trace.a 00:03:47.375 LIB 
libspdk_keyring.a 00:03:47.634 CC lib/sock/sock.o 00:03:47.634 CC lib/sock/sock_rpc.o 00:03:47.634 CC lib/thread/thread.o 00:03:47.634 CC lib/thread/iobuf.o 00:03:47.892 LIB libspdk_sock.a 00:03:48.150 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:48.150 CC lib/nvme/nvme_ctrlr.o 00:03:48.150 CC lib/nvme/nvme_fabric.o 00:03:48.150 CC lib/nvme/nvme_ns_cmd.o 00:03:48.150 CC lib/nvme/nvme_ns.o 00:03:48.150 CC lib/nvme/nvme_pcie_common.o 00:03:48.150 CC lib/nvme/nvme_pcie.o 00:03:48.150 CC lib/nvme/nvme_qpair.o 00:03:48.150 CC lib/nvme/nvme.o 00:03:48.150 CC lib/nvme/nvme_quirks.o 00:03:48.150 CC lib/nvme/nvme_transport.o 00:03:48.150 CC lib/nvme/nvme_discovery.o 00:03:48.150 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:48.150 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:48.150 CC lib/nvme/nvme_tcp.o 00:03:48.150 CC lib/nvme/nvme_opal.o 00:03:48.150 CC lib/nvme/nvme_io_msg.o 00:03:48.150 CC lib/nvme/nvme_poll_group.o 00:03:48.150 CC lib/nvme/nvme_stubs.o 00:03:48.150 CC lib/nvme/nvme_auth.o 00:03:48.150 CC lib/nvme/nvme_zns.o 00:03:48.150 CC lib/nvme/nvme_cuse.o 00:03:48.150 CC lib/nvme/nvme_vfio_user.o 00:03:48.150 CC lib/nvme/nvme_rdma.o 00:03:48.409 LIB libspdk_thread.a 00:03:48.668 CC lib/virtio/virtio.o 00:03:48.668 CC lib/virtio/virtio_vhost_user.o 00:03:48.668 CC lib/virtio/virtio_vfio_user.o 00:03:48.668 CC lib/virtio/virtio_pci.o 00:03:48.668 CC lib/blob/blobstore.o 00:03:48.668 CC lib/init/json_config.o 00:03:48.668 CC lib/blob/zeroes.o 00:03:48.668 CC lib/blob/request.o 00:03:48.668 CC lib/init/subsystem_rpc.o 00:03:48.668 CC lib/init/subsystem.o 00:03:48.668 CC lib/accel/accel_sw.o 00:03:48.668 CC lib/accel/accel.o 00:03:48.668 CC lib/blob/blob_bs_dev.o 00:03:48.668 CC lib/init/rpc.o 00:03:48.668 CC lib/accel/accel_rpc.o 00:03:48.668 CC lib/fsdev/fsdev.o 00:03:48.668 CC lib/fsdev/fsdev_io.o 00:03:48.668 CC lib/fsdev/fsdev_rpc.o 00:03:48.668 CC lib/vfu_tgt/tgt_endpoint.o 00:03:48.668 CC lib/vfu_tgt/tgt_rpc.o 00:03:48.926 LIB libspdk_init.a 00:03:48.926 LIB libspdk_virtio.a 00:03:48.926 LIB libspdk_vfu_tgt.a 00:03:48.926 LIB libspdk_fsdev.a 00:03:49.185 CC lib/event/app.o 00:03:49.185 CC lib/event/reactor.o 00:03:49.185 CC lib/event/log_rpc.o 00:03:49.185 CC lib/event/app_rpc.o 00:03:49.185 CC lib/event/scheduler_static.o 00:03:49.443 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:03:49.443 LIB libspdk_event.a 00:03:49.443 LIB libspdk_accel.a 00:03:49.443 LIB libspdk_nvme.a 00:03:49.701 CC lib/bdev/bdev.o 00:03:49.701 CC lib/bdev/bdev_rpc.o 00:03:49.701 CC lib/bdev/bdev_zone.o 00:03:49.701 CC lib/bdev/part.o 00:03:49.701 CC lib/bdev/scsi_nvme.o 00:03:49.701 LIB libspdk_fuse_dispatcher.a 00:03:50.270 LIB libspdk_blob.a 00:03:50.529 CC lib/lvol/lvol.o 00:03:50.529 CC lib/blobfs/blobfs.o 00:03:50.529 CC lib/blobfs/tree.o 00:03:51.274 LIB libspdk_lvol.a 00:03:51.274 LIB libspdk_blobfs.a 00:03:51.274 LIB libspdk_bdev.a 00:03:51.566 CC lib/ublk/ublk.o 00:03:51.566 CC lib/ublk/ublk_rpc.o 00:03:51.566 CC lib/nvmf/ctrlr.o 00:03:51.566 CC lib/nvmf/ctrlr_discovery.o 00:03:51.566 CC lib/nbd/nbd.o 00:03:51.566 CC lib/nvmf/ctrlr_bdev.o 00:03:51.566 CC lib/nbd/nbd_rpc.o 00:03:51.566 CC lib/nvmf/subsystem.o 00:03:51.566 CC lib/nvmf/nvmf.o 00:03:51.566 CC lib/nvmf/nvmf_rpc.o 00:03:51.566 CC lib/nvmf/transport.o 00:03:51.566 CC lib/nvmf/tcp.o 00:03:51.566 CC lib/nvmf/stubs.o 00:03:51.566 CC lib/nvmf/mdns_server.o 00:03:51.566 CC lib/scsi/dev.o 00:03:51.566 CC lib/nvmf/vfio_user.o 00:03:51.566 CC lib/scsi/lun.o 00:03:51.566 CC lib/nvmf/rdma.o 00:03:51.566 CC lib/scsi/port.o 00:03:51.566 CC lib/nvmf/auth.o 00:03:51.566 CC 
lib/scsi/scsi.o 00:03:51.566 CC lib/scsi/scsi_bdev.o 00:03:51.566 CC lib/scsi/scsi_pr.o 00:03:51.566 CC lib/scsi/scsi_rpc.o 00:03:51.566 CC lib/scsi/task.o 00:03:51.566 CC lib/ftl/ftl_core.o 00:03:51.566 CC lib/ftl/ftl_init.o 00:03:51.566 CC lib/ftl/ftl_layout.o 00:03:51.566 CC lib/ftl/ftl_debug.o 00:03:51.566 CC lib/ftl/ftl_l2p.o 00:03:51.566 CC lib/ftl/ftl_io.o 00:03:51.566 CC lib/ftl/ftl_sb.o 00:03:51.566 CC lib/ftl/ftl_l2p_flat.o 00:03:51.566 CC lib/ftl/ftl_band_ops.o 00:03:51.566 CC lib/ftl/ftl_nv_cache.o 00:03:51.566 CC lib/ftl/ftl_band.o 00:03:51.566 CC lib/ftl/ftl_writer.o 00:03:51.566 CC lib/ftl/ftl_rq.o 00:03:51.566 CC lib/ftl/ftl_reloc.o 00:03:51.566 CC lib/ftl/ftl_l2p_cache.o 00:03:51.566 CC lib/ftl/ftl_p2l.o 00:03:51.566 CC lib/ftl/ftl_p2l_log.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:51.566 CC lib/ftl/utils/ftl_conf.o 00:03:51.566 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:51.566 CC lib/ftl/utils/ftl_md.o 00:03:51.566 CC lib/ftl/utils/ftl_bitmap.o 00:03:51.566 CC lib/ftl/utils/ftl_mempool.o 00:03:51.566 CC lib/ftl/utils/ftl_property.o 00:03:51.566 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:51.566 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:51.566 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:51.566 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:51.566 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:51.566 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:51.566 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:51.566 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:51.825 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:51.825 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:51.825 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:51.825 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:03:51.825 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:03:51.825 CC lib/ftl/base/ftl_base_dev.o 00:03:51.825 CC lib/ftl/base/ftl_base_bdev.o 00:03:51.825 CC lib/ftl/ftl_trace.o 00:03:51.825 LIB libspdk_nbd.a 00:03:52.084 LIB libspdk_scsi.a 00:03:52.084 LIB libspdk_ublk.a 00:03:52.344 CC lib/iscsi/conn.o 00:03:52.344 CC lib/iscsi/param.o 00:03:52.344 CC lib/iscsi/init_grp.o 00:03:52.344 CC lib/iscsi/iscsi.o 00:03:52.344 CC lib/iscsi/portal_grp.o 00:03:52.344 CC lib/iscsi/tgt_node.o 00:03:52.344 CC lib/iscsi/iscsi_subsystem.o 00:03:52.344 LIB libspdk_ftl.a 00:03:52.344 CC lib/iscsi/iscsi_rpc.o 00:03:52.344 CC lib/iscsi/task.o 00:03:52.344 CC lib/vhost/vhost.o 00:03:52.344 CC lib/vhost/vhost_rpc.o 00:03:52.344 CC lib/vhost/vhost_scsi.o 00:03:52.344 CC lib/vhost/vhost_blk.o 00:03:52.344 CC lib/vhost/rte_vhost_user.o 00:03:52.913 LIB libspdk_nvmf.a 00:03:52.913 LIB libspdk_vhost.a 00:03:53.172 LIB libspdk_iscsi.a 00:03:53.431 CC module/env_dpdk/env_dpdk_rpc.o 00:03:53.431 CC module/vfu_device/vfu_virtio.o 00:03:53.431 CC module/vfu_device/vfu_virtio_blk.o 00:03:53.431 CC module/vfu_device/vfu_virtio_scsi.o 00:03:53.431 CC module/vfu_device/vfu_virtio_rpc.o 00:03:53.431 CC module/vfu_device/vfu_virtio_fs.o 00:03:53.690 CC module/keyring/file/keyring.o 00:03:53.690 CC module/keyring/file/keyring_rpc.o 00:03:53.690 CC module/accel/ioat/accel_ioat_rpc.o 00:03:53.690 CC 
module/accel/ioat/accel_ioat.o 00:03:53.690 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:53.690 CC module/keyring/linux/keyring.o 00:03:53.690 CC module/keyring/linux/keyring_rpc.o 00:03:53.690 LIB libspdk_env_dpdk_rpc.a 00:03:53.690 CC module/blob/bdev/blob_bdev.o 00:03:53.690 CC module/sock/posix/posix.o 00:03:53.690 CC module/accel/dsa/accel_dsa.o 00:03:53.690 CC module/accel/dsa/accel_dsa_rpc.o 00:03:53.690 CC module/accel/error/accel_error.o 00:03:53.690 CC module/accel/error/accel_error_rpc.o 00:03:53.690 CC module/accel/iaa/accel_iaa_rpc.o 00:03:53.690 CC module/accel/iaa/accel_iaa.o 00:03:53.690 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:53.690 CC module/scheduler/gscheduler/gscheduler.o 00:03:53.690 CC module/fsdev/aio/fsdev_aio.o 00:03:53.690 CC module/fsdev/aio/fsdev_aio_rpc.o 00:03:53.690 CC module/fsdev/aio/linux_aio_mgr.o 00:03:53.690 LIB libspdk_keyring_file.a 00:03:53.690 LIB libspdk_keyring_linux.a 00:03:53.690 LIB libspdk_accel_ioat.a 00:03:53.690 LIB libspdk_scheduler_dynamic.a 00:03:53.690 LIB libspdk_scheduler_dpdk_governor.a 00:03:53.690 LIB libspdk_accel_error.a 00:03:53.690 LIB libspdk_scheduler_gscheduler.a 00:03:53.690 LIB libspdk_accel_iaa.a 00:03:53.949 LIB libspdk_blob_bdev.a 00:03:53.949 LIB libspdk_accel_dsa.a 00:03:53.949 LIB libspdk_vfu_device.a 00:03:53.949 LIB libspdk_sock_posix.a 00:03:54.208 LIB libspdk_fsdev_aio.a 00:03:54.208 CC module/bdev/split/vbdev_split.o 00:03:54.208 CC module/bdev/split/vbdev_split_rpc.o 00:03:54.208 CC module/bdev/error/vbdev_error_rpc.o 00:03:54.208 CC module/bdev/error/vbdev_error.o 00:03:54.208 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:54.208 CC module/blobfs/bdev/blobfs_bdev.o 00:03:54.208 CC module/bdev/gpt/vbdev_gpt.o 00:03:54.208 CC module/bdev/gpt/gpt.o 00:03:54.208 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:54.208 CC module/bdev/ftl/bdev_ftl.o 00:03:54.208 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:54.208 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:54.208 CC module/bdev/delay/vbdev_delay.o 00:03:54.208 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:54.208 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:54.208 CC module/bdev/passthru/vbdev_passthru.o 00:03:54.208 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:54.208 CC module/bdev/malloc/bdev_malloc.o 00:03:54.208 CC module/bdev/aio/bdev_aio.o 00:03:54.208 CC module/bdev/aio/bdev_aio_rpc.o 00:03:54.208 CC module/bdev/raid/bdev_raid.o 00:03:54.208 CC module/bdev/raid/bdev_raid_rpc.o 00:03:54.208 CC module/bdev/nvme/bdev_nvme.o 00:03:54.208 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:54.208 CC module/bdev/raid/bdev_raid_sb.o 00:03:54.208 CC module/bdev/nvme/nvme_rpc.o 00:03:54.208 CC module/bdev/raid/raid1.o 00:03:54.208 CC module/bdev/raid/raid0.o 00:03:54.208 CC module/bdev/nvme/bdev_mdns_client.o 00:03:54.208 CC module/bdev/raid/concat.o 00:03:54.208 CC module/bdev/nvme/vbdev_opal.o 00:03:54.208 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:54.208 CC module/bdev/iscsi/bdev_iscsi.o 00:03:54.208 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:54.208 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:54.208 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:54.208 CC module/bdev/lvol/vbdev_lvol.o 00:03:54.208 CC module/bdev/null/bdev_null.o 00:03:54.208 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:54.208 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:54.208 CC module/bdev/null/bdev_null_rpc.o 00:03:54.208 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:54.468 LIB libspdk_blobfs_bdev.a 00:03:54.468 LIB libspdk_bdev_split.a 00:03:54.468 LIB 
libspdk_bdev_error.a 00:03:54.468 LIB libspdk_bdev_gpt.a 00:03:54.468 LIB libspdk_bdev_ftl.a 00:03:54.468 LIB libspdk_bdev_passthru.a 00:03:54.468 LIB libspdk_bdev_null.a 00:03:54.468 LIB libspdk_bdev_zone_block.a 00:03:54.468 LIB libspdk_bdev_aio.a 00:03:54.468 LIB libspdk_bdev_iscsi.a 00:03:54.468 LIB libspdk_bdev_delay.a 00:03:54.468 LIB libspdk_bdev_malloc.a 00:03:54.727 LIB libspdk_bdev_lvol.a 00:03:54.727 LIB libspdk_bdev_virtio.a 00:03:54.727 LIB libspdk_bdev_raid.a 00:03:55.667 LIB libspdk_bdev_nvme.a 00:03:56.237 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:56.237 CC module/event/subsystems/vmd/vmd.o 00:03:56.237 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:56.237 CC module/event/subsystems/sock/sock.o 00:03:56.237 CC module/event/subsystems/iobuf/iobuf.o 00:03:56.237 CC module/event/subsystems/keyring/keyring.o 00:03:56.237 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:56.237 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:56.237 CC module/event/subsystems/fsdev/fsdev.o 00:03:56.237 CC module/event/subsystems/scheduler/scheduler.o 00:03:56.237 LIB libspdk_event_vhost_blk.a 00:03:56.237 LIB libspdk_event_vmd.a 00:03:56.237 LIB libspdk_event_sock.a 00:03:56.237 LIB libspdk_event_keyring.a 00:03:56.237 LIB libspdk_event_fsdev.a 00:03:56.237 LIB libspdk_event_vfu_tgt.a 00:03:56.237 LIB libspdk_event_scheduler.a 00:03:56.237 LIB libspdk_event_iobuf.a 00:03:56.806 CC module/event/subsystems/accel/accel.o 00:03:56.806 LIB libspdk_event_accel.a 00:03:57.064 CC module/event/subsystems/bdev/bdev.o 00:03:57.064 LIB libspdk_event_bdev.a 00:03:57.633 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:57.633 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:57.633 CC module/event/subsystems/scsi/scsi.o 00:03:57.633 CC module/event/subsystems/ublk/ublk.o 00:03:57.633 CC module/event/subsystems/nbd/nbd.o 00:03:57.633 LIB libspdk_event_ublk.a 00:03:57.633 LIB libspdk_event_scsi.a 00:03:57.633 LIB libspdk_event_nbd.a 00:03:57.633 LIB libspdk_event_nvmf.a 00:03:57.894 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:57.894 CC module/event/subsystems/iscsi/iscsi.o 00:03:57.894 LIB libspdk_event_vhost_scsi.a 00:03:58.153 LIB libspdk_event_iscsi.a 00:03:58.415 CC app/spdk_nvme_perf/perf.o 00:03:58.415 CC app/spdk_nvme_identify/identify.o 00:03:58.415 CC app/trace_record/trace_record.o 00:03:58.415 CC app/spdk_lspci/spdk_lspci.o 00:03:58.415 CC app/spdk_top/spdk_top.o 00:03:58.415 CC app/spdk_nvme_discover/discovery_aer.o 00:03:58.415 CXX app/trace/trace.o 00:03:58.415 TEST_HEADER include/spdk/accel.h 00:03:58.415 TEST_HEADER include/spdk/accel_module.h 00:03:58.415 TEST_HEADER include/spdk/assert.h 00:03:58.415 CC app/iscsi_tgt/iscsi_tgt.o 00:03:58.415 TEST_HEADER include/spdk/base64.h 00:03:58.415 TEST_HEADER include/spdk/barrier.h 00:03:58.415 TEST_HEADER include/spdk/bdev.h 00:03:58.415 CC test/rpc_client/rpc_client_test.o 00:03:58.415 TEST_HEADER include/spdk/bdev_zone.h 00:03:58.415 TEST_HEADER include/spdk/bit_array.h 00:03:58.415 TEST_HEADER include/spdk/bdev_module.h 00:03:58.415 TEST_HEADER include/spdk/bit_pool.h 00:03:58.415 TEST_HEADER include/spdk/blob.h 00:03:58.415 TEST_HEADER include/spdk/blob_bdev.h 00:03:58.415 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:58.415 TEST_HEADER include/spdk/conf.h 00:03:58.415 TEST_HEADER include/spdk/blobfs.h 00:03:58.415 TEST_HEADER include/spdk/config.h 00:03:58.415 TEST_HEADER include/spdk/crc64.h 00:03:58.415 TEST_HEADER include/spdk/cpuset.h 00:03:58.415 TEST_HEADER include/spdk/crc16.h 00:03:58.415 TEST_HEADER 
include/spdk/crc32.h 00:03:58.415 TEST_HEADER include/spdk/env_dpdk.h 00:03:58.415 TEST_HEADER include/spdk/dif.h 00:03:58.415 TEST_HEADER include/spdk/dma.h 00:03:58.415 TEST_HEADER include/spdk/endian.h 00:03:58.415 TEST_HEADER include/spdk/event.h 00:03:58.415 TEST_HEADER include/spdk/fd_group.h 00:03:58.415 TEST_HEADER include/spdk/env.h 00:03:58.415 TEST_HEADER include/spdk/file.h 00:03:58.415 CC app/spdk_dd/spdk_dd.o 00:03:58.415 TEST_HEADER include/spdk/fsdev.h 00:03:58.415 TEST_HEADER include/spdk/fd.h 00:03:58.415 TEST_HEADER include/spdk/gpt_spec.h 00:03:58.415 TEST_HEADER include/spdk/fsdev_module.h 00:03:58.416 TEST_HEADER include/spdk/hexlify.h 00:03:58.416 TEST_HEADER include/spdk/ftl.h 00:03:58.416 TEST_HEADER include/spdk/fuse_dispatcher.h 00:03:58.416 TEST_HEADER include/spdk/init.h 00:03:58.416 TEST_HEADER include/spdk/idxd_spec.h 00:03:58.416 TEST_HEADER include/spdk/histogram_data.h 00:03:58.416 TEST_HEADER include/spdk/ioat.h 00:03:58.416 TEST_HEADER include/spdk/idxd.h 00:03:58.416 TEST_HEADER include/spdk/jsonrpc.h 00:03:58.416 TEST_HEADER include/spdk/iscsi_spec.h 00:03:58.416 TEST_HEADER include/spdk/ioat_spec.h 00:03:58.416 TEST_HEADER include/spdk/json.h 00:03:58.416 TEST_HEADER include/spdk/keyring_module.h 00:03:58.416 TEST_HEADER include/spdk/keyring.h 00:03:58.416 TEST_HEADER include/spdk/log.h 00:03:58.416 TEST_HEADER include/spdk/likely.h 00:03:58.416 TEST_HEADER include/spdk/memory.h 00:03:58.416 TEST_HEADER include/spdk/lvol.h 00:03:58.416 TEST_HEADER include/spdk/md5.h 00:03:58.416 TEST_HEADER include/spdk/nbd.h 00:03:58.416 TEST_HEADER include/spdk/net.h 00:03:58.416 TEST_HEADER include/spdk/mmio.h 00:03:58.416 TEST_HEADER include/spdk/notify.h 00:03:58.416 TEST_HEADER include/spdk/nvme.h 00:03:58.416 TEST_HEADER include/spdk/nvme_intel.h 00:03:58.416 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:58.416 TEST_HEADER include/spdk/nvme_spec.h 00:03:58.416 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:58.416 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:58.416 CC app/nvmf_tgt/nvmf_main.o 00:03:58.416 TEST_HEADER include/spdk/nvme_zns.h 00:03:58.416 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:58.416 TEST_HEADER include/spdk/nvmf_spec.h 00:03:58.416 TEST_HEADER include/spdk/nvmf.h 00:03:58.416 TEST_HEADER include/spdk/opal.h 00:03:58.416 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:58.416 TEST_HEADER include/spdk/opal_spec.h 00:03:58.416 TEST_HEADER include/spdk/nvmf_transport.h 00:03:58.416 TEST_HEADER include/spdk/pipe.h 00:03:58.416 TEST_HEADER include/spdk/queue.h 00:03:58.416 CC app/spdk_tgt/spdk_tgt.o 00:03:58.416 TEST_HEADER include/spdk/pci_ids.h 00:03:58.416 TEST_HEADER include/spdk/reduce.h 00:03:58.416 TEST_HEADER include/spdk/rpc.h 00:03:58.416 TEST_HEADER include/spdk/scheduler.h 00:03:58.416 TEST_HEADER include/spdk/scsi_spec.h 00:03:58.416 TEST_HEADER include/spdk/scsi.h 00:03:58.416 TEST_HEADER include/spdk/stdinc.h 00:03:58.416 TEST_HEADER include/spdk/sock.h 00:03:58.416 TEST_HEADER include/spdk/string.h 00:03:58.416 TEST_HEADER include/spdk/trace.h 00:03:58.416 TEST_HEADER include/spdk/thread.h 00:03:58.416 TEST_HEADER include/spdk/trace_parser.h 00:03:58.416 TEST_HEADER include/spdk/util.h 00:03:58.416 TEST_HEADER include/spdk/ublk.h 00:03:58.416 TEST_HEADER include/spdk/tree.h 00:03:58.416 TEST_HEADER include/spdk/uuid.h 00:03:58.416 TEST_HEADER include/spdk/version.h 00:03:58.416 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:58.416 TEST_HEADER include/spdk/vhost.h 00:03:58.416 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:58.416 
TEST_HEADER include/spdk/zipf.h 00:03:58.416 CXX test/cpp_headers/accel.o 00:03:58.416 TEST_HEADER include/spdk/xor.h 00:03:58.416 TEST_HEADER include/spdk/vmd.h 00:03:58.416 CXX test/cpp_headers/assert.o 00:03:58.416 CXX test/cpp_headers/accel_module.o 00:03:58.416 CXX test/cpp_headers/barrier.o 00:03:58.416 CXX test/cpp_headers/base64.o 00:03:58.416 CXX test/cpp_headers/bdev.o 00:03:58.416 CXX test/cpp_headers/bdev_zone.o 00:03:58.416 CXX test/cpp_headers/bdev_module.o 00:03:58.416 CXX test/cpp_headers/bit_array.o 00:03:58.416 CXX test/cpp_headers/bit_pool.o 00:03:58.416 CXX test/cpp_headers/blob_bdev.o 00:03:58.416 CXX test/cpp_headers/blobfs_bdev.o 00:03:58.416 CXX test/cpp_headers/blob.o 00:03:58.416 CXX test/cpp_headers/conf.o 00:03:58.416 CXX test/cpp_headers/config.o 00:03:58.416 CXX test/cpp_headers/blobfs.o 00:03:58.416 CXX test/cpp_headers/cpuset.o 00:03:58.416 CXX test/cpp_headers/crc16.o 00:03:58.416 CXX test/cpp_headers/crc64.o 00:03:58.416 CXX test/cpp_headers/crc32.o 00:03:58.416 CXX test/cpp_headers/dma.o 00:03:58.416 CXX test/cpp_headers/endian.o 00:03:58.416 CXX test/cpp_headers/dif.o 00:03:58.416 CXX test/cpp_headers/env_dpdk.o 00:03:58.416 CXX test/cpp_headers/fd_group.o 00:03:58.416 CXX test/cpp_headers/event.o 00:03:58.416 CXX test/cpp_headers/file.o 00:03:58.416 CXX test/cpp_headers/fd.o 00:03:58.416 CXX test/cpp_headers/env.o 00:03:58.416 CXX test/cpp_headers/ftl.o 00:03:58.416 CXX test/cpp_headers/fsdev_module.o 00:03:58.416 CXX test/cpp_headers/fsdev.o 00:03:58.416 CXX test/cpp_headers/gpt_spec.o 00:03:58.416 CXX test/cpp_headers/hexlify.o 00:03:58.416 CXX test/cpp_headers/fuse_dispatcher.o 00:03:58.416 CXX test/cpp_headers/histogram_data.o 00:03:58.416 CXX test/cpp_headers/idxd_spec.o 00:03:58.416 CXX test/cpp_headers/idxd.o 00:03:58.416 CXX test/cpp_headers/ioat.o 00:03:58.416 CXX test/cpp_headers/ioat_spec.o 00:03:58.416 CXX test/cpp_headers/iscsi_spec.o 00:03:58.416 CXX test/cpp_headers/init.o 00:03:58.416 CXX test/cpp_headers/json.o 00:03:58.416 CXX test/cpp_headers/jsonrpc.o 00:03:58.416 LINK spdk_lspci 00:03:58.416 CXX test/cpp_headers/keyring.o 00:03:58.416 CXX test/cpp_headers/keyring_module.o 00:03:58.416 CXX test/cpp_headers/likely.o 00:03:58.416 CXX test/cpp_headers/log.o 00:03:58.416 CC examples/util/zipf/zipf.o 00:03:58.416 CXX test/cpp_headers/lvol.o 00:03:58.416 CXX test/cpp_headers/md5.o 00:03:58.416 CXX test/cpp_headers/memory.o 00:03:58.416 CXX test/cpp_headers/nbd.o 00:03:58.416 CXX test/cpp_headers/mmio.o 00:03:58.416 CXX test/cpp_headers/net.o 00:03:58.416 CXX test/cpp_headers/notify.o 00:03:58.416 CXX test/cpp_headers/nvme.o 00:03:58.416 CXX test/cpp_headers/nvme_intel.o 00:03:58.416 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:58.416 CXX test/cpp_headers/nvme_ocssd.o 00:03:58.416 CC app/fio/nvme/fio_plugin.o 00:03:58.416 CXX test/cpp_headers/nvme_spec.o 00:03:58.416 CXX test/cpp_headers/nvmf_cmd.o 00:03:58.416 CXX test/cpp_headers/nvmf.o 00:03:58.416 CXX test/cpp_headers/nvme_zns.o 00:03:58.416 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:58.416 CXX test/cpp_headers/nvmf_spec.o 00:03:58.416 CXX test/cpp_headers/nvmf_transport.o 00:03:58.416 CC test/thread/poller_perf/poller_perf.o 00:03:58.416 CXX test/cpp_headers/opal.o 00:03:58.416 CXX test/cpp_headers/opal_spec.o 00:03:58.416 CXX test/cpp_headers/pci_ids.o 00:03:58.416 CXX test/cpp_headers/pipe.o 00:03:58.416 CXX test/cpp_headers/queue.o 00:03:58.416 CC test/env/pci/pci_ut.o 00:03:58.416 CC test/app/jsoncat/jsoncat.o 00:03:58.416 CXX test/cpp_headers/reduce.o 00:03:58.416 CXX 
test/cpp_headers/rpc.o 00:03:58.416 CXX test/cpp_headers/scheduler.o 00:03:58.416 CC test/env/vtophys/vtophys.o 00:03:58.416 CXX test/cpp_headers/scsi.o 00:03:58.416 CXX test/cpp_headers/scsi_spec.o 00:03:58.416 CXX test/cpp_headers/sock.o 00:03:58.416 CXX test/cpp_headers/stdinc.o 00:03:58.416 CXX test/cpp_headers/string.o 00:03:58.416 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:58.416 CC test/env/memory/memory_ut.o 00:03:58.416 CXX test/cpp_headers/thread.o 00:03:58.416 CXX test/cpp_headers/trace.o 00:03:58.416 CXX test/cpp_headers/trace_parser.o 00:03:58.416 CC examples/ioat/verify/verify.o 00:03:58.416 CC test/app/stub/stub.o 00:03:58.416 CC test/thread/lock/spdk_lock.o 00:03:58.416 CC test/app/histogram_perf/histogram_perf.o 00:03:58.416 CC examples/ioat/perf/perf.o 00:03:58.416 CXX test/cpp_headers/tree.o 00:03:58.416 CC test/dma/test_dma/test_dma.o 00:03:58.416 CXX test/cpp_headers/ublk.o 00:03:58.416 LINK rpc_client_test 00:03:58.675 LINK spdk_nvme_discover 00:03:58.675 CC test/app/bdev_svc/bdev_svc.o 00:03:58.675 CC app/fio/bdev/fio_plugin.o 00:03:58.675 LINK spdk_trace_record 00:03:58.675 CC test/env/mem_callbacks/mem_callbacks.o 00:03:58.675 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:58.675 CXX test/cpp_headers/util.o 00:03:58.675 CXX test/cpp_headers/uuid.o 00:03:58.675 CXX test/cpp_headers/version.o 00:03:58.675 LINK nvmf_tgt 00:03:58.675 LINK iscsi_tgt 00:03:58.675 CXX test/cpp_headers/vfio_user_pci.o 00:03:58.675 LINK interrupt_tgt 00:03:58.675 LINK zipf 00:03:58.675 CXX test/cpp_headers/vfio_user_spec.o 00:03:58.675 CXX test/cpp_headers/vhost.o 00:03:58.675 CXX test/cpp_headers/vmd.o 00:03:58.675 CXX test/cpp_headers/xor.o 00:03:58.675 CXX test/cpp_headers/zipf.o 00:03:58.675 LINK vtophys 00:03:58.675 LINK jsoncat 00:03:58.675 LINK poller_perf 00:03:58.675 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:58.675 LINK spdk_tgt 00:03:58.675 LINK histogram_perf 00:03:58.675 LINK env_dpdk_post_init 00:03:58.675 LINK stub 00:03:58.675 LINK ioat_perf 00:03:58.675 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:58.675 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:58.675 LINK verify 00:03:58.675 LINK bdev_svc 00:03:58.675 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:58.675 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:58.934 LINK spdk_trace 00:03:58.934 LINK spdk_dd 00:03:58.934 LINK pci_ut 00:03:58.934 LINK spdk_nvme_identify 00:03:58.934 LINK test_dma 00:03:58.934 LINK nvme_fuzz 00:03:58.934 LINK spdk_nvme 00:03:58.934 LINK spdk_nvme_perf 00:03:58.934 LINK spdk_bdev 00:03:58.934 LINK llvm_vfio_fuzz 00:03:58.934 LINK vhost_fuzz 00:03:58.934 LINK mem_callbacks 00:03:59.193 LINK spdk_top 00:03:59.193 CC examples/sock/hello_world/hello_sock.o 00:03:59.193 LINK llvm_nvme_fuzz 00:03:59.193 CC examples/vmd/lsvmd/lsvmd.o 00:03:59.193 CC examples/vmd/led/led.o 00:03:59.193 CC app/vhost/vhost.o 00:03:59.193 CC examples/idxd/perf/perf.o 00:03:59.450 CC examples/thread/thread/thread_ex.o 00:03:59.450 LINK memory_ut 00:03:59.450 LINK lsvmd 00:03:59.450 LINK led 00:03:59.450 LINK hello_sock 00:03:59.450 LINK vhost 00:03:59.450 LINK idxd_perf 00:03:59.450 LINK spdk_lock 00:03:59.450 LINK thread 00:03:59.708 LINK iscsi_fuzz 00:04:00.275 CC examples/nvme/abort/abort.o 00:04:00.275 CC examples/nvme/hotplug/hotplug.o 00:04:00.275 CC examples/nvme/nvme_manage/nvme_manage.o 00:04:00.275 CC examples/nvme/hello_world/hello_world.o 00:04:00.275 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:00.275 CC examples/nvme/reconnect/reconnect.o 00:04:00.275 CC 
examples/nvme/arbitration/arbitration.o 00:04:00.275 CC examples/nvme/cmb_copy/cmb_copy.o 00:04:00.275 CC test/event/app_repeat/app_repeat.o 00:04:00.275 CC test/event/reactor/reactor.o 00:04:00.275 CC test/event/reactor_perf/reactor_perf.o 00:04:00.275 CC test/event/event_perf/event_perf.o 00:04:00.275 CC test/event/scheduler/scheduler.o 00:04:00.275 LINK pmr_persistence 00:04:00.275 LINK hotplug 00:04:00.275 LINK hello_world 00:04:00.275 LINK reactor_perf 00:04:00.275 LINK cmb_copy 00:04:00.275 LINK app_repeat 00:04:00.275 LINK reactor 00:04:00.275 LINK event_perf 00:04:00.275 LINK abort 00:04:00.275 LINK reconnect 00:04:00.275 LINK arbitration 00:04:00.275 LINK scheduler 00:04:00.533 LINK nvme_manage 00:04:00.533 CC test/nvme/aer/aer.o 00:04:00.533 CC test/nvme/reset/reset.o 00:04:00.533 CC test/nvme/err_injection/err_injection.o 00:04:00.533 CC test/nvme/overhead/overhead.o 00:04:00.533 CC test/nvme/sgl/sgl.o 00:04:00.533 CC test/nvme/connect_stress/connect_stress.o 00:04:00.533 CC test/nvme/startup/startup.o 00:04:00.533 CC test/nvme/doorbell_aers/doorbell_aers.o 00:04:00.533 CC test/nvme/e2edp/nvme_dp.o 00:04:00.533 CC test/nvme/compliance/nvme_compliance.o 00:04:00.533 CC test/nvme/reserve/reserve.o 00:04:00.533 CC test/nvme/boot_partition/boot_partition.o 00:04:00.533 CC test/nvme/fused_ordering/fused_ordering.o 00:04:00.533 CC test/nvme/cuse/cuse.o 00:04:00.533 CC test/nvme/simple_copy/simple_copy.o 00:04:00.533 CC test/nvme/fdp/fdp.o 00:04:00.533 CC test/blobfs/mkfs/mkfs.o 00:04:00.791 CC test/accel/dif/dif.o 00:04:00.791 CC test/lvol/esnap/esnap.o 00:04:00.791 LINK startup 00:04:00.791 LINK connect_stress 00:04:00.791 LINK doorbell_aers 00:04:00.791 LINK err_injection 00:04:00.791 LINK boot_partition 00:04:00.791 LINK reserve 00:04:00.791 LINK fused_ordering 00:04:00.791 LINK reset 00:04:00.791 LINK aer 00:04:00.791 LINK sgl 00:04:00.791 LINK simple_copy 00:04:00.791 LINK nvme_dp 00:04:00.791 LINK overhead 00:04:00.791 LINK mkfs 00:04:00.791 LINK fdp 00:04:00.791 LINK nvme_compliance 00:04:01.049 LINK dif 00:04:01.307 CC examples/accel/perf/accel_perf.o 00:04:01.307 CC examples/blob/hello_world/hello_blob.o 00:04:01.307 CC examples/blob/cli/blobcli.o 00:04:01.307 CC examples/fsdev/hello_world/hello_fsdev.o 00:04:01.565 LINK cuse 00:04:01.565 LINK hello_blob 00:04:01.565 LINK hello_fsdev 00:04:01.565 LINK accel_perf 00:04:01.565 LINK blobcli 00:04:02.498 CC examples/bdev/hello_world/hello_bdev.o 00:04:02.498 CC examples/bdev/bdevperf/bdevperf.o 00:04:02.498 LINK hello_bdev 00:04:02.498 CC test/bdev/bdevio/bdevio.o 00:04:02.756 LINK bdevperf 00:04:02.756 LINK bdevio 00:04:04.129 LINK esnap 00:04:04.387 CC examples/nvmf/nvmf/nvmf.o 00:04:04.387 LINK nvmf 00:04:05.763 00:04:05.763 real 0m36.165s 00:04:05.763 user 4m38.887s 00:04:05.763 sys 1m39.856s 00:04:05.763 21:29:07 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:04:05.763 21:29:07 make -- common/autotest_common.sh@10 -- $ set +x 00:04:05.763 ************************************ 00:04:05.763 END TEST make 00:04:05.763 ************************************ 00:04:05.763 21:29:07 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:05.763 21:29:07 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:05.763 21:29:07 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:05.763 21:29:07 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:05.763 21:29:07 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:04:05.763 
21:29:07 -- pm/common@44 -- $ pid=3177749 00:04:05.763 21:29:07 -- pm/common@50 -- $ kill -TERM 3177749 00:04:05.763 21:29:07 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:05.763 21:29:07 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:05.763 21:29:07 -- pm/common@44 -- $ pid=3177751 00:04:05.763 21:29:07 -- pm/common@50 -- $ kill -TERM 3177751 00:04:05.763 21:29:07 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:05.763 21:29:07 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:05.763 21:29:07 -- pm/common@44 -- $ pid=3177753 00:04:05.763 21:29:07 -- pm/common@50 -- $ kill -TERM 3177753 00:04:05.763 21:29:07 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:05.763 21:29:07 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:05.763 21:29:07 -- pm/common@44 -- $ pid=3177775 00:04:05.763 21:29:07 -- pm/common@50 -- $ sudo -E kill -TERM 3177775 00:04:05.763 21:29:07 -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:04:05.764 21:29:07 -- common/autotest_common.sh@1689 -- # lcov --version 00:04:05.764 21:29:07 -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:04:06.022 21:29:07 -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:04:06.022 21:29:07 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:06.022 21:29:07 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:06.022 21:29:07 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:06.022 21:29:07 -- scripts/common.sh@336 -- # IFS=.-: 00:04:06.022 21:29:07 -- scripts/common.sh@336 -- # read -ra ver1 00:04:06.022 21:29:07 -- scripts/common.sh@337 -- # IFS=.-: 00:04:06.022 21:29:07 -- scripts/common.sh@337 -- # read -ra ver2 00:04:06.022 21:29:07 -- scripts/common.sh@338 -- # local 'op=<' 00:04:06.022 21:29:07 -- scripts/common.sh@340 -- # ver1_l=2 00:04:06.022 21:29:07 -- scripts/common.sh@341 -- # ver2_l=1 00:04:06.022 21:29:07 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:06.022 21:29:07 -- scripts/common.sh@344 -- # case "$op" in 00:04:06.022 21:29:07 -- scripts/common.sh@345 -- # : 1 00:04:06.022 21:29:07 -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:06.022 21:29:07 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:06.022 21:29:07 -- scripts/common.sh@365 -- # decimal 1 00:04:06.022 21:29:07 -- scripts/common.sh@353 -- # local d=1 00:04:06.022 21:29:07 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:06.022 21:29:07 -- scripts/common.sh@355 -- # echo 1 00:04:06.022 21:29:07 -- scripts/common.sh@365 -- # ver1[v]=1 00:04:06.022 21:29:07 -- scripts/common.sh@366 -- # decimal 2 00:04:06.022 21:29:07 -- scripts/common.sh@353 -- # local d=2 00:04:06.022 21:29:07 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:06.022 21:29:07 -- scripts/common.sh@355 -- # echo 2 00:04:06.022 21:29:07 -- scripts/common.sh@366 -- # ver2[v]=2 00:04:06.022 21:29:07 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:06.022 21:29:07 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:06.022 21:29:07 -- scripts/common.sh@368 -- # return 0 00:04:06.022 21:29:07 -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:06.022 21:29:07 -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:04:06.022 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:06.022 --rc genhtml_branch_coverage=1 00:04:06.022 --rc genhtml_function_coverage=1 00:04:06.022 --rc genhtml_legend=1 00:04:06.022 --rc geninfo_all_blocks=1 00:04:06.022 --rc geninfo_unexecuted_blocks=1 00:04:06.022 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:06.022 ' 00:04:06.022 21:29:07 -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:04:06.022 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:06.022 --rc genhtml_branch_coverage=1 00:04:06.022 --rc genhtml_function_coverage=1 00:04:06.022 --rc genhtml_legend=1 00:04:06.022 --rc geninfo_all_blocks=1 00:04:06.022 --rc geninfo_unexecuted_blocks=1 00:04:06.022 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:06.022 ' 00:04:06.022 21:29:07 -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:04:06.022 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:06.022 --rc genhtml_branch_coverage=1 00:04:06.022 --rc genhtml_function_coverage=1 00:04:06.022 --rc genhtml_legend=1 00:04:06.022 --rc geninfo_all_blocks=1 00:04:06.022 --rc geninfo_unexecuted_blocks=1 00:04:06.022 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:06.022 ' 00:04:06.022 21:29:07 -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:04:06.022 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:06.022 --rc genhtml_branch_coverage=1 00:04:06.022 --rc genhtml_function_coverage=1 00:04:06.022 --rc genhtml_legend=1 00:04:06.022 --rc geninfo_all_blocks=1 00:04:06.022 --rc geninfo_unexecuted_blocks=1 00:04:06.022 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:06.022 ' 00:04:06.022 21:29:07 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:06.022 21:29:07 -- nvmf/common.sh@7 -- # uname -s 00:04:06.022 21:29:07 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:06.022 21:29:07 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:06.022 21:29:07 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:06.022 21:29:07 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:06.022 21:29:07 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:06.022 21:29:07 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:06.022 21:29:07 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:06.022 21:29:07 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:06.022 21:29:07 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:06.022 21:29:07 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:06.022 21:29:07 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:06.022 21:29:07 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:06.022 21:29:07 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:06.022 21:29:07 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:06.022 21:29:07 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:06.022 21:29:07 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:06.022 21:29:07 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:06.022 21:29:07 -- scripts/common.sh@15 -- # shopt -s extglob 00:04:06.022 21:29:07 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:06.022 21:29:07 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:06.022 21:29:07 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:06.022 21:29:07 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:06.022 21:29:07 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:06.022 21:29:07 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:06.022 21:29:07 -- paths/export.sh@5 -- # export PATH 00:04:06.022 21:29:07 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:06.022 21:29:07 -- nvmf/common.sh@51 -- # : 0 00:04:06.022 21:29:07 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:04:06.022 21:29:07 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:04:06.022 21:29:07 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:06.022 21:29:07 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:06.022 21:29:07 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:06.022 21:29:07 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:04:06.022 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:04:06.022 21:29:07 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:04:06.022 21:29:07 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:04:06.022 21:29:07 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:04:06.022 21:29:07 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:06.022 21:29:07 -- spdk/autotest.sh@32 -- # uname -s 00:04:06.022 
21:29:07 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:06.022 21:29:07 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:06.022 21:29:07 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:06.022 21:29:07 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:06.022 21:29:07 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:06.022 21:29:07 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:06.022 21:29:07 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:06.022 21:29:07 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:06.022 21:29:07 -- spdk/autotest.sh@48 -- # udevadm_pid=3257193 00:04:06.022 21:29:07 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:06.022 21:29:07 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:06.022 21:29:07 -- pm/common@17 -- # local monitor 00:04:06.022 21:29:07 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:06.022 21:29:07 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:06.022 21:29:07 -- pm/common@21 -- # date +%s 00:04:06.022 21:29:07 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:06.022 21:29:07 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:06.022 21:29:07 -- pm/common@21 -- # date +%s 00:04:06.022 21:29:07 -- pm/common@25 -- # sleep 1 00:04:06.022 21:29:07 -- pm/common@21 -- # date +%s 00:04:06.022 21:29:07 -- pm/common@21 -- # date +%s 00:04:06.022 21:29:07 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1730060947 00:04:06.023 21:29:07 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1730060947 00:04:06.023 21:29:07 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1730060947 00:04:06.023 21:29:07 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1730060947 00:04:06.023 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1730060947_collect-cpu-load.pm.log 00:04:06.023 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1730060947_collect-vmstat.pm.log 00:04:06.023 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1730060947_collect-cpu-temp.pm.log 00:04:06.023 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1730060947_collect-bmc-pm.bmc.pm.log 00:04:06.958 21:29:08 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:06.958 21:29:08 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:06.958 21:29:08 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:06.958 21:29:08 -- common/autotest_common.sh@10 -- # set +x 
00:04:06.958 21:29:08 -- spdk/autotest.sh@59 -- # create_test_list 00:04:06.958 21:29:08 -- common/autotest_common.sh@748 -- # xtrace_disable 00:04:06.958 21:29:08 -- common/autotest_common.sh@10 -- # set +x 00:04:07.217 21:29:08 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:04:07.217 21:29:08 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:07.217 21:29:08 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:07.217 21:29:08 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:04:07.217 21:29:08 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:07.217 21:29:08 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:07.217 21:29:08 -- common/autotest_common.sh@1453 -- # uname 00:04:07.217 21:29:08 -- common/autotest_common.sh@1453 -- # '[' Linux = FreeBSD ']' 00:04:07.217 21:29:08 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:07.217 21:29:08 -- common/autotest_common.sh@1473 -- # uname 00:04:07.217 21:29:08 -- common/autotest_common.sh@1473 -- # [[ Linux = FreeBSD ]] 00:04:07.217 21:29:08 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:04:07.217 21:29:08 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:04:07.217 lcov: LCOV version 1.15 00:04:07.217 21:29:08 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:04:12.505 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:17.774 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:23.035 21:29:24 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:23.035 21:29:24 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:23.035 21:29:24 -- common/autotest_common.sh@10 -- # set +x 00:04:23.035 21:29:24 -- spdk/autotest.sh@78 -- # rm -f 00:04:23.035 21:29:24 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:26.316 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:26.316 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:26.316 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:26.573 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:26.573 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:26.573 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:26.573 21:29:28 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:26.573 21:29:28 -- common/autotest_common.sh@1653 -- # zoned_devs=() 00:04:26.573 21:29:28 -- common/autotest_common.sh@1653 -- # local -gA zoned_devs 00:04:26.573 21:29:28 -- common/autotest_common.sh@1654 -- # local nvme bdf 00:04:26.573 21:29:28 -- common/autotest_common.sh@1656 -- # for nvme in /sys/block/nvme* 00:04:26.573 21:29:28 -- common/autotest_common.sh@1657 -- # is_block_zoned nvme0n1 00:04:26.573 21:29:28 -- common/autotest_common.sh@1646 -- # local device=nvme0n1 00:04:26.573 21:29:28 -- common/autotest_common.sh@1648 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:26.573 21:29:28 -- common/autotest_common.sh@1649 -- # [[ none != none ]] 00:04:26.573 21:29:28 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:26.573 21:29:28 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:26.573 21:29:28 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:26.573 21:29:28 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:26.573 21:29:28 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:26.573 21:29:28 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:26.573 No valid GPT data, bailing 00:04:26.573 21:29:28 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:26.573 21:29:28 -- scripts/common.sh@394 -- # pt= 00:04:26.573 21:29:28 -- scripts/common.sh@395 -- # return 1 00:04:26.573 21:29:28 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:26.573 1+0 records in 00:04:26.573 1+0 records out 00:04:26.573 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00233736 s, 449 MB/s 00:04:26.573 21:29:28 -- spdk/autotest.sh@105 -- # sync 00:04:26.573 21:29:28 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:26.573 21:29:28 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:26.573 21:29:28 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:34.682 21:29:35 -- spdk/autotest.sh@111 -- # uname -s 00:04:34.682 21:29:35 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:34.682 21:29:35 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:34.682 21:29:35 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:34.682 21:29:35 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:34.682 21:29:35 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:34.682 21:29:35 -- common/autotest_common.sh@10 -- # set +x 00:04:34.682 ************************************ 00:04:34.682 START TEST setup.sh 00:04:34.682 ************************************ 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:34.682 * Looking for test storage... 
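The get_zoned_devs pass traced just above walks /sys/block/nvme* and probes each namespace's queue/zoned attribute; the lone `[[ none != none ]]` is nvme0n1 reporting itself as a conventional device, so `(( 0 > 0 ))` finds nothing to exclude and the run proceeds to the GPT probe and the dd wipe. A condensed sketch of that probe, assuming the standard sysfs block layout (the real helper also tracks each device's PCI bdf, elided here):

  declare -A zoned_devs=()
  for nvme in /sys/block/nvme*; do
    dev=${nvme##*/}
    [[ -e $nvme/queue/zoned ]] || continue
    # queue/zoned reads "none" for conventional namespaces,
    # "host-managed"/"host-aware" for zoned ones
    [[ $(< "$nvme/queue/zoned") != none ]] && zoned_devs[$dev]=1
  done
  (( ${#zoned_devs[@]} > 0 )) && echo "zoned devices: ${!zoned_devs[*]}"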
00:04:34.682 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1689 -- # lcov --version 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:34.682 21:29:35 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:04:34.682 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.682 --rc genhtml_branch_coverage=1 00:04:34.682 --rc genhtml_function_coverage=1 00:04:34.682 --rc genhtml_legend=1 00:04:34.682 --rc geninfo_all_blocks=1 00:04:34.682 --rc geninfo_unexecuted_blocks=1 00:04:34.682 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.682 ' 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:04:34.682 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.682 --rc genhtml_branch_coverage=1 00:04:34.682 --rc genhtml_function_coverage=1 00:04:34.682 --rc genhtml_legend=1 00:04:34.682 --rc geninfo_all_blocks=1 00:04:34.682 --rc geninfo_unexecuted_blocks=1 
00:04:34.682 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.682 ' 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:04:34.682 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.682 --rc genhtml_branch_coverage=1 00:04:34.682 --rc genhtml_function_coverage=1 00:04:34.682 --rc genhtml_legend=1 00:04:34.682 --rc geninfo_all_blocks=1 00:04:34.682 --rc geninfo_unexecuted_blocks=1 00:04:34.682 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.682 ' 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:04:34.682 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.682 --rc genhtml_branch_coverage=1 00:04:34.682 --rc genhtml_function_coverage=1 00:04:34.682 --rc genhtml_legend=1 00:04:34.682 --rc geninfo_all_blocks=1 00:04:34.682 --rc geninfo_unexecuted_blocks=1 00:04:34.682 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.682 ' 00:04:34.682 21:29:35 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:34.682 21:29:35 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:34.682 21:29:35 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:34.682 21:29:35 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:34.682 ************************************ 00:04:34.682 START TEST acl 00:04:34.682 ************************************ 00:04:34.682 21:29:35 setup.sh.acl -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:34.682 * Looking for test storage... 
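The `lt 1.15 2` walk traced above is scripts/common.sh deciding whether the installed lcov (1.15, scraped from `lcov --version` with `awk '{print $NF}'`) predates version 2; it returns 0 here, so the pre-2.0 `--rc lcov_branch_coverage=1` option spellings get exported into LCOV_OPTS. Reduced to its essentials, the comparison splits both versions on dots, dashes, and colons and compares component-wise as integers. A simplified sketch covering only the <, > and == operators exercised here (the real cmp_versions also validates each component through its decimal helper):

  lt() { cmp_versions "$1" '<' "$2"; }
  cmp_versions() {
    local -a ver1 ver2
    local op=$2 v a b
    IFS=.-: read -ra ver1 <<< "$1"
    IFS=.-: read -ra ver2 <<< "$3"
    for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
      a=${ver1[v]:-0} b=${ver2[v]:-0}      # pad the shorter version with zeros
      ((a > b)) && { [[ $op == '>' ]]; return; }
      ((a < b)) && { [[ $op == '<' ]]; return; }
    done
    [[ $op == '==' ]]                      # every component matched
  }

  lt 1.15 2 && echo "old lcov"             # 1 < 2 in the first component -> true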
00:04:34.682 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:34.682 21:29:35 setup.sh.acl -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:04:34.682 21:29:35 setup.sh.acl -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:04:34.682 21:29:35 setup.sh.acl -- common/autotest_common.sh@1689 -- # lcov --version 00:04:34.682 21:29:36 setup.sh.acl -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:34.682 21:29:36 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:34.682 21:29:36 setup.sh.acl -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:04:34.683 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.683 --rc genhtml_branch_coverage=1 00:04:34.683 --rc genhtml_function_coverage=1 00:04:34.683 --rc genhtml_legend=1 00:04:34.683 --rc geninfo_all_blocks=1 00:04:34.683 --rc geninfo_unexecuted_blocks=1 00:04:34.683 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.683 ' 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:04:34.683 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.683 --rc genhtml_branch_coverage=1 00:04:34.683 --rc 
genhtml_function_coverage=1 00:04:34.683 --rc genhtml_legend=1 00:04:34.683 --rc geninfo_all_blocks=1 00:04:34.683 --rc geninfo_unexecuted_blocks=1 00:04:34.683 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.683 ' 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:04:34.683 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.683 --rc genhtml_branch_coverage=1 00:04:34.683 --rc genhtml_function_coverage=1 00:04:34.683 --rc genhtml_legend=1 00:04:34.683 --rc geninfo_all_blocks=1 00:04:34.683 --rc geninfo_unexecuted_blocks=1 00:04:34.683 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.683 ' 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:04:34.683 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.683 --rc genhtml_branch_coverage=1 00:04:34.683 --rc genhtml_function_coverage=1 00:04:34.683 --rc genhtml_legend=1 00:04:34.683 --rc geninfo_all_blocks=1 00:04:34.683 --rc geninfo_unexecuted_blocks=1 00:04:34.683 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.683 ' 00:04:34.683 21:29:36 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1653 -- # zoned_devs=() 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1653 -- # local -gA zoned_devs 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1654 -- # local nvme bdf 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1656 -- # for nvme in /sys/block/nvme* 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1657 -- # is_block_zoned nvme0n1 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1646 -- # local device=nvme0n1 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1648 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:34.683 21:29:36 setup.sh.acl -- common/autotest_common.sh@1649 -- # [[ none != none ]] 00:04:34.683 21:29:36 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:34.683 21:29:36 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:34.683 21:29:36 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:34.683 21:29:36 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:34.683 21:29:36 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:34.683 21:29:36 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:34.683 21:29:36 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:38.864 21:29:39 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:38.864 21:29:39 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:38.864 21:29:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:38.864 21:29:39 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:38.864 21:29:39 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:38.864 21:29:39 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:41.394 Hugepages 00:04:41.394 node hugesize free / total 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:41.394 21:29:43 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:41.394 00:04:41.394 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:41.394 21:29:43 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
[xtrace of the per-device scan condensed: each of the sixteen ioatdma controllers at 0000:00:04.0-0000:00:04.7 and 0000:80:04.0-0000:80:04.7 matches *:*:*.* at setup/acl.sh@19, fails [[ ioatdma == nvme ]] at setup/acl.sh@20, and is skipped with continue]
00:04:41.653 21:29:43 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:41.653 21:29:43 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:41.653 21:29:43 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:41.653 21:29:43 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:41.653 21:29:43 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:41.653 21:29:43 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:41.653 21:29:43 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:41.653 21:29:43 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:41.653 21:29:43 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:41.653 21:29:43 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:41.653 21:29:43 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:41.653 ************************************ 00:04:41.653 START TEST denied 00:04:41.653 ************************************ 00:04:41.653 21:29:43 setup.sh.acl.denied --
common/autotest_common.sh@1125 -- # denied 00:04:41.653 21:29:43 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:41.653 21:29:43 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:41.653 21:29:43 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:41.653 21:29:43 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:41.653 21:29:43 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:45.840 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:45.840 21:29:46 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:45.840 21:29:46 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:45.840 21:29:46 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:45.840 21:29:46 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:45.840 21:29:46 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:45.840 21:29:46 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:45.840 21:29:46 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:45.840 21:29:46 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:45.840 21:29:46 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:45.840 21:29:46 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:50.030 00:04:50.030 real 0m8.265s 00:04:50.030 user 0m2.655s 00:04:50.030 sys 0m4.978s 00:04:50.030 21:29:51 setup.sh.acl.denied -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:50.030 21:29:51 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:50.030 ************************************ 00:04:50.030 END TEST denied 00:04:50.030 ************************************ 00:04:50.030 21:29:51 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:50.030 21:29:51 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:50.030 21:29:51 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:50.030 21:29:51 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:50.030 ************************************ 00:04:50.030 START TEST allowed 00:04:50.030 ************************************ 00:04:50.030 21:29:51 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # allowed 00:04:50.030 21:29:51 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:50.030 21:29:51 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:50.030 21:29:51 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:50.030 21:29:51 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:50.030 21:29:51 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:55.299 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:55.299 21:29:56 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:55.299 21:29:56 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:55.299 21:29:56 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:55.299 21:29:56 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:55.299 21:29:56 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:58.589 00:04:58.589 real 0m8.045s 00:04:58.589 user 0m2.091s 00:04:58.589 sys 0m4.519s 00:04:58.589 21:29:59 setup.sh.acl.allowed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:58.589 21:29:59 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:58.589 ************************************ 00:04:58.589 END TEST allowed 00:04:58.589 ************************************ 00:04:58.589 00:04:58.589 real 0m23.890s 00:04:58.589 user 0m7.459s 00:04:58.589 sys 0m14.620s 00:04:58.589 21:29:59 setup.sh.acl -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:58.589 21:29:59 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:58.589 ************************************ 00:04:58.589 END TEST acl 00:04:58.589 ************************************ 00:04:58.589 21:29:59 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:58.589 21:29:59 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:58.589 21:29:59 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:58.589 21:29:59 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:58.589 ************************************ 00:04:58.589 START TEST hugepages 00:04:58.589 ************************************ 00:04:58.589 21:29:59 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:58.589 * Looking for test storage... 00:04:58.589 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:58.589 21:29:59 setup.sh.hugepages -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:04:58.589 21:29:59 setup.sh.hugepages -- common/autotest_common.sh@1689 -- # lcov --version 00:04:58.589 21:29:59 setup.sh.hugepages -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:04:58.589 21:30:00 setup.sh.hugepages -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:58.589 21:30:00 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:04:58.589 21:30:00 setup.sh.hugepages -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:58.589 21:30:00 setup.sh.hugepages -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:04:58.589 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:58.589 --rc genhtml_branch_coverage=1 00:04:58.589 --rc genhtml_function_coverage=1 00:04:58.589 --rc genhtml_legend=1 00:04:58.589 --rc geninfo_all_blocks=1 00:04:58.589 --rc geninfo_unexecuted_blocks=1 00:04:58.589 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:58.589 ' 00:04:58.589 21:30:00 setup.sh.hugepages -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:04:58.589 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:58.589 --rc genhtml_branch_coverage=1 00:04:58.589 --rc genhtml_function_coverage=1 00:04:58.589 --rc genhtml_legend=1 00:04:58.589 --rc geninfo_all_blocks=1 00:04:58.589 --rc geninfo_unexecuted_blocks=1 00:04:58.589 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:58.589 ' 00:04:58.589 21:30:00 setup.sh.hugepages -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:04:58.589 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:58.589 --rc genhtml_branch_coverage=1 00:04:58.589 --rc genhtml_function_coverage=1 00:04:58.589 --rc genhtml_legend=1 00:04:58.589 --rc geninfo_all_blocks=1 00:04:58.589 --rc geninfo_unexecuted_blocks=1 00:04:58.589 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:58.590 ' 00:04:58.590 21:30:00 setup.sh.hugepages -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:04:58.590 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:58.590 --rc genhtml_branch_coverage=1 00:04:58.590 --rc genhtml_function_coverage=1 00:04:58.590 --rc genhtml_legend=1 00:04:58.590 --rc geninfo_all_blocks=1 00:04:58.590 --rc geninfo_unexecuted_blocks=1 00:04:58.590 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:58.590 ' 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:58.590 21:30:00 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 37901016 kB' 'MemAvailable: 41360264 kB' 'Buffers: 4304 kB' 'Cached: 13844876 kB' 'SwapCached: 40 kB' 'Active: 11141960 kB' 'Inactive: 3310396 kB' 'Active(anon): 10630336 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 607376 kB' 'Mapped: 176368 kB' 'Shmem: 10035196 kB' 'KReclaimable: 284260 kB' 'Slab: 1251712 kB' 'SReclaimable: 284260 kB' 'SUnreclaim: 967452 kB' 'KernelStack: 22016 kB' 'PageTables: 8936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433352 kB' 'Committed_AS: 11879640 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217008 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB' 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # 
IFS=': ' 00:04:58.590 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
[xtrace of the field-by-field scan condensed: every remaining /proc/meminfo field from Buffers through ShmemPmdMapped fails the Hugepagesize match at setup/common.sh@32 and is skipped with continue]
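All of that scanning is setup/common.sh's get_meminfo hunting for a single key: it snapshots the meminfo source (a node's meminfo under /sys/devices/system/node when a node is given, with the leading "Node N" stripped, otherwise /proc/meminfo) and prints the value of the first field whose name matches. A stripped-down sketch of the /proc/meminfo path only (the per-node prefix handling is elided):

  get_meminfo() {   # get_meminfo Hugepagesize  ->  2048
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
      # "Hugepagesize: 2048 kB" splits into var=Hugepagesize val=2048
      [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1
  }

The match finally lands on the Hugepagesize entry just below, and the resulting 2048 (kB) is what seeds default_hugepages.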
00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.591 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@16 -- # 
default_hugepages=2048
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}"
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}"
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes
00:04:58.592 21:30:00 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup
00:04:58.592 21:30:00 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:04:58.592 21:30:00 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:04:58.592 21:30:00 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:58.592 ************************************
00:04:58.592 START TEST single_node_setup
00:04:58.592 ************************************
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1125 -- # single_node_setup
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0
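[Editor's note] clear_hp above zeroes every existing per-node hugepage pool, and get_test_nr_hugepages, entered here and traced below, converts the requested 2097152 kB into 2097152 / 2048 = 1024 default-size pages pinned to node 0. The writes themselves happen later inside scripts/setup.sh, driven by the exported NRHUGE/HUGENODE; a direct sysfs sketch with the same net effect (run as root; loose globs; the paths are the standard kernel hugepage knobs) would be:

shopt -s nullglob
# clear_hp: drop any existing reservation on every NUMA node
for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*/nr_hugepages; do
        echo 0 > "$hp"    # the repeated "echo 0" steps in the trace
    done
done
# single_node_setup: NRHUGE=1024 pages of 2048 kB, HUGENODE=0 only
echo 1024 > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages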
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@48 -- # local size=2097152
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0')
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]]
00:04:58.592 21:30:00 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:01.872 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:05:01.872 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:05:03.254 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup
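[Editor's note] The '(8086 2021): ioatdma -> vfio-pci' lines above are scripts/setup.sh detaching each I/OAT DMA channel, and finally the NVMe drive at 0000:d8:00.0, from its kernel driver and handing it to vfio-pci for userspace use. The usual sysfs mechanism for such a rebind looks like the sketch below; this is the generic driver_override flow, not SPDK's exact code, and the BDF is just one example taken from the log:

modprobe vfio-pci                                   # make sure the target driver is loaded
bdf=0000:00:04.7                                    # one of the ioatdma channels above
dev=/sys/bus/pci/devices/$bdf
[[ -e $dev/driver ]] && echo "$bdf" > "$dev/driver/unbind"
echo vfio-pci > "$dev/driver_override"              # pin the next probe to vfio-pci
echo "$bdf" > /sys/bus/pci/drivers_probe            # re-probe the device
echo > "$dev/driver_override"                       # clear the override again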
-- setup/hugepages.sh@137 -- # verify_nr_hugepages 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40000808 kB' 'MemAvailable: 43460068 kB' 'Buffers: 4304 kB' 'Cached: 13845004 kB' 'SwapCached: 40 kB' 'Active: 11144172 kB' 'Inactive: 3310396 kB' 'Active(anon): 10632548 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 608400 kB' 'Mapped: 176088 kB' 'Shmem: 10035324 kB' 'KReclaimable: 284284 kB' 'Slab: 1250016 kB' 'SReclaimable: 284284 kB' 'SUnreclaim: 965732 kB' 'KernelStack: 21920 kB' 'PageTables: 8344 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11883612 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217156 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB' 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.254 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.255 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 
00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:03.256 21:30:04 
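[Editor's note] get_meminfo is node-aware: given a node number it reads /sys/devices/system/node/nodeN/meminfo and strips the "Node N " prefix those files carry (the mapfile -t mem and ${mem[@]#Node +([0-9]) } steps in the trace). Here node is empty, so the [[ -e /sys/devices/system/node/node/meminfo ]] test fails and the global /proc/meminfo is used instead. A sketch of that selection logic, with assumed names:

shopt -s extglob    # needed for the +([0-9]) pattern below
get_node_meminfo_sketch() {
    local get=$1 node=${2-} mem_f=/proc/meminfo line var val _
    local -a mem
    # Prefer the per-node file when a node was given and it exists.
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefix of per-node files
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

get_node_meminfo_sketch HugePages_Surp      # global lookup, as in the trace
get_node_meminfo_sketch HugePages_Total 0   # same field, restricted to node 0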
setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40009936 kB' 'MemAvailable: 43469196 kB' 'Buffers: 4304 kB' 'Cached: 13845008 kB' 'SwapCached: 40 kB' 'Active: 11139688 kB' 'Inactive: 3310396 kB' 'Active(anon): 10628064 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 604084 kB' 'Mapped: 175912 kB' 'Shmem: 10035328 kB' 'KReclaimable: 284284 kB' 'Slab: 1249972 kB' 'SReclaimable: 284284 kB' 'SUnreclaim: 965688 kB' 'KernelStack: 22016 kB' 'PageTables: 8648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11878532 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217184 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
[[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.256 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.257 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40017780 kB' 'MemAvailable: 43477040 kB' 'Buffers: 4304 kB' 'Cached: 13845020 kB' 'SwapCached: 40 kB' 'Active: 11139804 kB' 'Inactive: 3310396 kB' 'Active(anon): 10628180 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 604280 kB' 'Mapped: 175396 kB' 'Shmem: 10035340 kB' 'KReclaimable: 284284 kB' 'Slab: 1249948 kB' 'SReclaimable: 284284 kB' 'SUnreclaim: 965664 kB' 'KernelStack: 22192 kB' 'PageTables: 9048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11883652 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217248 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 
21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.258 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.259 21:30:04 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:05:03.259 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:03.260 nr_hugepages=1024
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:03.260 resv_hugepages=0
00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:03.260 surplus_hugepages=0
00:05:03.260 21:30:04
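
The scan just completed is the core of common.sh's get_meminfo: each meminfo line is split with IFS=': ' into a key and a value, keys are skipped until the requested field (here HugePages_Rsvd) matches, and its value is echoed back — 0 on this host, hence resv=0. A minimal standalone sketch of the same parsing idiom; meminfo_field is a hypothetical name, not a function from the traced scripts:

meminfo_field() {
    # Print the value of one /proc/meminfo key, e.g. `meminfo_field HugePages_Rsvd`.
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip keys until the requested one matches
        echo "$val"                        # numeric value; a trailing "kB" lands in the discarded field
        return 0
    done < /proc/meminfo
    return 1
}
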
setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:03.260 anon_hugepages=0 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40017384 kB' 'MemAvailable: 43476644 kB' 'Buffers: 4304 kB' 'Cached: 13845020 kB' 'SwapCached: 40 kB' 'Active: 11140792 kB' 'Inactive: 3310396 kB' 'Active(anon): 10629168 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 604668 kB' 'Mapped: 175656 kB' 'Shmem: 10035340 kB' 'KReclaimable: 284284 kB' 'Slab: 1249900 kB' 'SReclaimable: 284284 kB' 'SUnreclaim: 965616 kB' 'KernelStack: 22032 kB' 'PageTables: 8348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11881468 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217152 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ 
MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.260 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:03.261 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- 
setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 17564672 kB' 'MemUsed: 15020696 kB' 'SwapCached: 8 kB' 'Active: 8458012 kB' 'Inactive: 3089056 kB' 'Active(anon): 8124204 kB' 'Inactive(anon): 7988 kB' 'Active(file): 333808 kB' 'Inactive(file): 3081068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11079788 kB' 'Mapped: 95456 kB' 'AnonPages: 470508 kB' 'Shmem: 7664904 kB' 'KernelStack: 12408 kB' 'PageTables: 5248 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 176184 kB' 'Slab: 658416 kB' 'SReclaimable: 176184 kB' 'SUnreclaim: 482232 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- 
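
Worth noting in the lookup above: because get_meminfo was called with node 0, the existence test at common.sh@23 succeeds and @24 swaps mem_f from /proc/meminfo to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix that the extglob expansion at common.sh@29 strips off. A rough sketch of that node-aware source selection, with illustrative helper names and assuming the same sysfs layout:

shopt -s extglob   # needed for the +([0-9]) pattern below

meminfo_source() {
    # Use the per-node meminfo file when a node is given and present;
    # otherwise fall back to the system-wide /proc/meminfo.
    local node=$1 mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    echo "$mem_f"
}

strip_node_prefix() {
    # Per-node lines read "Node 0 MemTotal: ...", so drop the "Node <N> " prefix,
    # mirroring the expansion mem=("${mem[@]#Node +([0-9]) }") traced at common.sh@29.
    local line=$1
    echo "${line#Node +([0-9]) }"
}
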
setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.262 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # 
[xtrace elided: the IFS=': '; read -r var val _ scan over /proc/meminfo skipped SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total and HugePages_Free before reaching its target]
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:03.263 node0=1024 expecting 1024
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:03.263 
00:05:03.263 real 0m4.776s
00:05:03.263 user 0m1.141s
00:05:03.263 sys 0m1.928s
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1126 -- # xtrace_disable
00:05:03.263 21:30:04 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x
00:05:03.263 ************************************
00:05:03.263 END TEST single_node_setup
00:05:03.263 ************************************
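The scan that just completed is setup/common.sh's get_meminfo helper at work: it reads /proc/meminfo with IFS=': ', skips every field until the requested one, and echoes its value. A minimal sketch of that pattern, reconstructed from the xtrace above (simplified, not the verbatim SPDK helper; the real one also mapfiles the whole file and strips any leading 'Node <n>' prefix when reading a per-node meminfo file, per the mem=("${mem[@]#Node +([0-9]) }") line in the trace):

  # sketch of the per-field scan seen in the trace (assumed simplification)
  get_meminfo_sketch() {
    local get=$1 mem_f=/proc/meminfo var val _
    while IFS=': ' read -r var val _; do
      [[ $var == "$get" ]] || continue   # every non-matching field is skipped, as logged
      echo "$val"
      return 0
    done < "$mem_f"
    return 1
  }
  get_meminfo_sketch HugePages_Surp   # prints 0 on this host, matching "echo 0" above
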
00:05:03.263 21:30:04 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc
00:05:03.263 21:30:04 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:05:03.263 21:30:04 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:05:03.263 21:30:04 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:03.521 ************************************
00:05:03.521 START TEST even_2G_alloc
00:05:03.521 ************************************
00:05:03.521 21:30:05 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # even_2G_alloc
00:05:03.521 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152
00:05:03.521 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:03.521 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:03.521 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:03.521 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:03.521 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:03.521 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:03.521 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:03.521 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:03.522 21:30:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:06.816 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:06.816 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
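Before the verification below starts, the allocation arithmetic traced above is worth spelling out: 2097152 kB (2 GiB) at the default 2048 kB hugepage size gives nr_hugepages=1024, and with no user-specified nodes the @80-@83 loop deals 512 pages to each of the two NUMA nodes (the ": 512" no-ops in the trace). A compact reconstruction, assuming both size and Hugepagesize are in kB as the numbers imply (names follow the trace, logic simplified):

  # reconstruction of get_test_nr_hugepages / get_test_nr_hugepages_per_node for this run
  size=2097152 default_hugepages=2048 _no_nodes=2
  nr_hugepages=$((size / default_hugepages))         # 1024 pages
  declare -a nodes_test
  while ((_no_nodes > 0)); do
    nodes_test[_no_nodes - 1]=$((nr_hugepages / 2))  # 512 per node, matching ": 512"
    ((_no_nodes--))
  done
  echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # node0=512 node1=512
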
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:06.816 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40019208 kB' 'MemAvailable: 43478468 kB' 'Buffers: 4304 kB' 'Cached: 13845164 kB' 'SwapCached: 40 kB' 'Active: 11137064 kB' 'Inactive: 3310396 kB' 'Active(anon): 10625440 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 601284 kB' 'Mapped: 174408 kB' 'Shmem: 10035484 kB' 'KReclaimable: 284284 kB' 'Slab: 1249828 kB' 'SReclaimable: 284284 kB' 'SUnreclaim: 965544 kB' 'KernelStack: 22144 kB' 'PageTables: 8560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11869812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217248 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
[xtrace elided: the per-field scan skipped every field from MemTotal through HardwareCorrupted before matching]
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0
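The @95 test above is the transparent-hugepage gate: 'always [madvise] never' is this host's THP mode string, and since the bracketed selection is not [never], the script goes on to read AnonHugePages (0 kB here, hence anon=0). The same gate spelled out as a sketch, reusing the get_meminfo_sketch helper from earlier (assumed simplification of hugepages.sh@95-96):

  # only count anonymous hugepages when THP is not fully disabled
  thp_mode=$(</sys/kernel/mm/transparent_hugepage/enabled)  # "always [madvise] never" here
  anon=0
  if [[ $thp_mode != *"[never]"* ]]; then
    anon=$(get_meminfo_sketch AnonHugePages)  # 0 on this host
  fi
  echo "anon=$anon"
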
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:06.818 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' [/proc/meminfo snapshot re-read; identical to the previous one except: MemFree: 40019320 kB, MemAvailable: 43478580 kB, Cached: 13845168 kB, Active: 11137156 kB, Active(anon): 10625532 kB, AnonPages: 601308 kB, Mapped: 174384 kB, Shmem: 10035488 kB, Slab: 1249792 kB, SUnreclaim: 965508 kB, KernelStack: 22016 kB, PageTables: 8736 kB, Committed_AS: 11869828 kB, VmallocUsed: 217088 kB]
[xtrace elided: the per-field scan skipped every field from MemTotal through HugePages_Rsvd before matching]
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0
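With anon=0 and surp=0 collected, and HugePages_Rsvd about to be read below, verify_nr_hugepages has everything it needs to repeat the per-node comparison that single_node_setup ended with ('node0=1024 expecting 1024'); for this test the expectation would be 512 per node. A sketch of that final comparison, continuing the nodes_test array from the split sketch above (nodes_expected is a name invented here for illustration, not the script's own variable):

  # per-node check in the style of the hugepages.sh@125-129 trace
  declare -A nodes_expected=([0]=512 [1]=512)
  for node in "${!nodes_test[@]}"; do
    echo "node$node=${nodes_test[node]} expecting ${nodes_expected[$node]}"
    [[ ${nodes_test[node]} == "${nodes_expected[$node]}" ]] || echo "mismatch on node$node"
  done
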
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:06.820 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' [/proc/meminfo snapshot re-read; identical to the previous one except: MemFree: 40022836 kB, MemAvailable: 43482096 kB, Cached: 13845188 kB, Active: 11136744 kB, Active(anon): 10625120 kB, AnonPages: 601348 kB, Shmem: 10035508 kB, Slab: 1250036 kB, SUnreclaim: 965752 kB, KernelStack: 21936 kB, PageTables: 8392 kB, Committed_AS: 11869852 kB, VmallocUsed: 217056 kB]
[xtrace truncated here mid-scan: the HugePages_Rsvd pass has skipped the fields MemTotal through Slab so far; the log continues past this excerpt]
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@33 -- # echo 0 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:06.821 nr_hugepages=1024 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:06.821 resv_hugepages=0 00:05:06.821 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:06.821 surplus_hugepages=0 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:06.822 anon_hugepages=0 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40022964 kB' 'MemAvailable: 43482224 kB' 'Buffers: 4304 kB' 'Cached: 13845204 kB' 'SwapCached: 40 kB' 'Active: 11137188 kB' 'Inactive: 3310396 kB' 'Active(anon): 10625564 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 601264 kB' 'Mapped: 174384 kB' 'Shmem: 10035524 kB' 'KReclaimable: 284284 kB' 'Slab: 1250036 kB' 'SReclaimable: 284284 kB' 'SUnreclaim: 965752 kB' 'KernelStack: 21952 kB' 'PageTables: 8232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11869872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217088 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 
'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.822 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:06.823 21:30:08 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.823 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18629008 kB' 'MemUsed: 13956360 kB' 'SwapCached: 8 kB' 'Active: 8453660 kB' 'Inactive: 3089056 kB' 'Active(anon): 8119852 kB' 'Inactive(anon): 7988 kB' 'Active(file): 333808 kB' 'Inactive(file): 3081068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11079932 kB' 'Mapped: 94176 kB' 'AnonPages: 465996 kB' 'Shmem: 7665048 kB' 'KernelStack: 12376 kB' 'PageTables: 5192 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 176184 kB' 'Slab: 658448 kB' 'SReclaimable: 176184 kB' 'SUnreclaim: 482264 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 
21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 
21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.824 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- 
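The node-0 lookup has just completed: with node=0 the helper switched mem_f from /proc/meminfo to /sys/devices/system/node/node0/meminfo, stripped the "Node 0 " prefix from every line, and echoed HugePages_Surp as 0; the same pass repeats for node 1 directly below. Node 0 reports HugePages_Total: 512 and the node-1 snapshot below shows 512 as well, so the per-node counts sum to the global 1024 -- the even split the even_2G_alloc case is asserting. A sketch of the per-node variant, again under a hypothetical name (node_meminfo_get), with the extglob prefix strip taken from the logged mem=("${mem[@]#Node +([0-9]) }") step:

# Hypothetical per-node variant; the path selection and "Node N " prefix
# strip follow the commands logged in this trace.
shopt -s extglob
node_meminfo_get() {
    local get=$1 node=$2 mem_f=/proc/meminfo line var val _
    # Per-node stats live under sysfs; fall back to /proc/meminfo otherwise.
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # sysfs lines start with "Node N "
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

# In this run: node 0 and node 1 each report HugePages_Total: 512, matching
# the global total of 1024. At the 2048 kB page size that is 512 * 2048 kB
# = 1 GiB per node, i.e. the 2 GiB allocation split evenly across both nodes.
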
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698432 kB' 'MemFree: 21394844 kB' 'MemUsed: 6303588 kB' 'SwapCached: 32 kB' 'Active: 2683236 kB' 'Inactive: 221340 kB' 'Active(anon): 2505420 kB' 'Inactive(anon): 48 kB' 'Active(file): 177816 kB' 'Inactive(file): 221292 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2769648 kB' 'Mapped: 80208 kB' 'AnonPages: 135032 kB' 'Shmem: 2370508 kB' 'KernelStack: 9448 kB' 'PageTables: 3000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 108100 kB' 'Slab: 591588 kB' 'SReclaimable: 108100 kB' 'SUnreclaim: 483488 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.825 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 
21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- 
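The long "-- # continue" runs collapsed above are setup/common.sh's get_meminfo walking a meminfo snapshot one field per iteration. A minimal reconstruction inferred from the @17-@33 markers in the trace; the exact loop form and line numbers in upstream setup/common.sh may differ:

    get_meminfo() {
        local get=$1 node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # per-node query: read the node's own meminfo instead (@23-@24);
        # with an empty $node the path does not exist and /proc/meminfo is kept
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # node files prefix every line with "Node <id> "; strip it (@29)
        shopt -s extglob
        mem=("${mem[@]#Node +([0-9]) }")
        # scan field by field; every non-matching field is one "continue" in the xtrace
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val" && return 0       # "echo 0" / "return 0" at @33 above
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

With the node1 snapshot above, get_meminfo HugePages_Surp 1 prints 0, matching the echoed value in the log.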
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:05:06.826 node0=512 expecting 512
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:05:06.826 node1=512 expecting 512
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]]
00:05:06.826 real	0m3.335s
00:05:06.826 user	0m1.249s
00:05:06.826 sys	0m2.095s
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:05:06.826 21:30:08 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:06.826 ************************************
00:05:06.826 END TEST even_2G_alloc
00:05:06.826 ************************************
00:05:06.826 21:30:08 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc
00:05:06.826 21:30:08 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:05:06.826 21:30:08 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:05:06.826 21:30:08 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:06.826 ************************************
00:05:06.826 START TEST odd_alloc
00:05:06.826 ************************************
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # odd_alloc
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
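odd_alloc requests size=2098176 kB precisely so the 2 MiB page count comes out odd. A sketch of the arithmetic and of the @80-@83 split that follows; the ceil rounding inside get_test_nr_hugepages is my assumption (the trace only shows the result, nr_hugepages=1025), while the loop values mirror the xtrace directly:

    # size -> page count (assumed ceil division; the log only shows the result)
    size_kb=2098176                  # HUGEMEM=2049 MiB, from hugepages.sh@149/@150
    default_hugepages=2048           # 'Hugepagesize: 2048 kB' in the snapshots
    nr_hugepages=$(( (size_kb + default_hugepages - 1) / default_hugepages ))
    echo "$nr_hugepages"             # 1025

    # per-node split seen at hugepages.sh@80-@83: fill the last node first,
    # carry the remainder forward so node0 ends up with the odd page
    _nr_hugepages=$nr_hugepages
    _no_nodes=2
    nodes_test=()
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))  # 512, then 513
        : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))         # ": 513", ": 0"
        : $(( --_no_nodes ))                                        # ": 1", ": 0"
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"            # node0=513 node1=512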
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:06.826 21:30:08 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:10.117 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:10.117 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
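The verify_nr_hugepages pass starting here gathers three system-wide counters before checking each node. A sketch of its opening steps reconstructed from the @95-@99 markers (names from the trace, control flow inferred; it reuses the get_meminfo sketch above). The @95 guard only bothers reading AnonHugePages when transparent hugepages are not pinned to [never], since THP-backed anonymous pages could otherwise skew the accounting; here the setting is "always [madvise] never", so the guard passes and AnonHugePages comes back 0:

    verify_nr_hugepages() {
        local node surp resv anon
        anon=0
        # THP not set to [never] -> anon hugepages may inflate the numbers (@95)
        if [[ $(< /sys/kernel/mm/transparent_hugepage/enabled) != *"[never]"* ]]; then
            anon=$(get_meminfo AnonHugePages)    # 0 in this run (@96)
        fi
        surp=$(get_meminfo HugePages_Surp)       # 0 (@98)
        resv=$(get_meminfo HugePages_Rsvd)       # being fetched when the capture ends (@99)
        echo "anon=$anon surp=$surp resv=$resv"
    }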
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.381 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40065340 kB' 'MemAvailable: 43524592 kB' 'Buffers: 4304 kB' 'Cached: 13845340 kB' 'SwapCached: 40 kB' 'Active: 11136924 kB' 'Inactive: 3310396 kB' 'Active(anon): 10625300 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 600960 kB' 'Mapped: 174952 kB' 'Shmem: 10035660 kB' 'KReclaimable: 284268 kB' 'Slab: 1249664 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965396 kB' 'KernelStack: 21808 kB' 'PageTables: 8084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480904 kB' 'Committed_AS: 11868964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217008 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [snapshot fields, MemTotal through HardwareCorrupted, skipped via continue until AnonHugePages matches]
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.382 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.383 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40057728 kB' 'MemAvailable: 43516980 kB' 'Buffers: 4304 kB' 'Cached: 13845340 kB' 'SwapCached: 40 kB' 'Active: 11141928 kB' 'Inactive: 3310396 kB' 'Active(anon): 10630304 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 605944 kB' 'Mapped: 174896 kB' 'Shmem: 10035660 kB' 'KReclaimable: 284268 kB' 'Slab: 1249660 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965392 kB' 'KernelStack: 21840 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480904 kB' 'Committed_AS: 11874008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216996 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
00:05:10.383 21:30:11 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [snapshot fields, MemTotal through HugePages_Rsvd, skipped via continue until HugePages_Surp matches]
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 --
# [[ -n '' ]] 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40061252 kB' 'MemAvailable: 43520504 kB' 'Buffers: 4304 kB' 'Cached: 13845360 kB' 'SwapCached: 40 kB' 'Active: 11138016 kB' 'Inactive: 3310396 kB' 'Active(anon): 10626392 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 601980 kB' 'Mapped: 174896 kB' 'Shmem: 10035680 kB' 'KReclaimable: 284268 kB' 'Slab: 1249660 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965392 kB' 'KernelStack: 21840 kB' 'PageTables: 8156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480904 kB' 'Committed_AS: 11870056 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216992 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB' 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.384 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.385 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025 00:05:10.386 nr_hugepages=1025 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:10.386 resv_hugepages=0 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:10.386 surplus_hugepages=0 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:10.386 anon_hugepages=0 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages )) 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.386 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40053736 kB' 'MemAvailable: 43512988 kB' 'Buffers: 4304 kB' 'Cached: 13845380 kB' 'SwapCached: 40 kB' 'Active: 11142008 kB' 'Inactive: 3310396 kB' 'Active(anon): 10630384 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 605976 kB' 'Mapped: 175244 kB' 'Shmem: 10035700 kB' 'KReclaimable: 284268 kB' 'Slab: 1249660 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965392 kB' 'KernelStack: 21840 kB' 'PageTables: 8188 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480904 kB' 
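The get_meminfo helper being traced here follows a simple pattern: snapshot the meminfo source, split each 'Key: value' line on ': ', and return the value once the requested key matches; every non-matching key is the "continue" visible in the xtrace. A minimal, self-contained sketch of that pattern (illustrative only, not the verbatim setup/common.sh source; the function name is made up):

  #!/usr/bin/env bash
  # Sketch: return the value for one /proc/meminfo key, as the trace shows.
  get_meminfo_sketch() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] || continue   # skip non-matching keys
          echo "$val"
          return 0
      done </proc/meminfo
      return 1
  }
  get_meminfo_sketch HugePages_Rsvd   # prints 0 on this box, per the snapshot above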
[... xtrace elided: every key from MemTotal through Unaccepted is tested against HugePages_Total and skipped via continue ...]
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
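get_nodes just came back with nodes_sys[0]=513 and nodes_sys[1]=512, which is exactly what an odd allocation should look like: 1025 pages cannot split evenly across two NUMA nodes, so one node carries the extra page. A quick sketch of that expected even-spread-with-remainder split (an assumed policy, but it reproduces the numbers above):

  # 1025 hugepages over 2 nodes -> node0: 513, node1: 512
  nr_hugepages=1025 no_nodes=2
  for ((node = 0; node < no_nodes; node++)); do
      # floor share plus one extra for the first (count % nodes) nodes
      echo "node$node: $((nr_hugepages / no_nodes + (node < nr_hugepages % no_nodes)))"
  done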
setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18638272 kB' 'MemUsed: 13947096 kB' 'SwapCached: 8 kB' 'Active: 8453816 kB' 'Inactive: 3089056 kB' 'Active(anon): 8120008 kB' 'Inactive(anon): 7988 kB' 'Active(file): 333808 kB' 'Inactive(file): 3081068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11080032 kB' 'Mapped: 94184 kB' 'AnonPages: 466188 kB' 'Shmem: 7665148 kB' 'KernelStack: 12408 kB' 'PageTables: 5340 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 176168 kB' 'Slab: 658140 kB' 'SReclaimable: 176168 kB' 'SUnreclaim: 481972 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': '
00:05:10.388 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
[xtrace elided: the setup/common.sh@31 read and setup/common.sh@32 compare-and-continue pair repeats for every remaining node-0 meminfo key, Active through HugePages_Free in snapshot order, none of which is the requested HugePages_Surp]
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
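The scan above is setup/common.sh's get_meminfo helper walking every key of the node-0 meminfo file until HugePages_Surp matches; the call that follows repeats the lookup for node 1. Read back from the xtrace, the helper plausibly looks like the sketch below (the names follow the trace, but the body is a reconstruction, not SPDK's verbatim source):

    #!/usr/bin/env bash
    shopt -s extglob  # needed for the +([0-9]) pattern below

    # get_meminfo KEY [NODE] -- print the value of KEY from /proc/meminfo,
    # or from the per-node sysfs meminfo when NODE is given.
    get_meminfo() {
        local get=$1 node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # With no NODE argument the sysfs path does not exist, so the
        # global /proc/meminfo is kept (exactly what the trace shows).
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue  # the long compare-and-continue run above
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    get_meminfo HugePages_Free 1   # usage as in the trace: key, then node

Each line of the trace's repetitive block corresponds to one pass through the while loop here, which is why the log grows by two entries per meminfo key.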
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:10.389 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698432 kB' 'MemFree: 21423996 kB' 'MemUsed: 6274436 kB' 'SwapCached: 32 kB' 'Active: 2683108 kB' 'Inactive: 221340 kB' 'Active(anon): 2505292 kB' 'Inactive(anon): 48 kB' 'Active(file): 177816 kB' 'Inactive(file): 221292 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2769716 kB' 'Mapped: 80208 kB' 'AnonPages: 134816 kB' 'Shmem: 2370576 kB' 'KernelStack: 9480 kB' 'PageTables: 2984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 108100 kB' 'Slab: 591520 kB' 'SReclaimable: 108100 kB' 'SUnreclaim: 483420 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: the node-1 scan repeats the compare-and-continue pair for MemTotal through HugePages_Free, in the order shown in the snapshot above, before reaching the match]
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513'
node0=513 expecting 513
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:05:10.650
00:05:10.650 real	0m3.708s
00:05:10.650 user	0m1.407s
00:05:10.650 sys	0m2.364s
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:05:10.650 21:30:12 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:10.650 ************************************
00:05:10.650 END TEST odd_alloc
00:05:10.650 ************************************
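The pass/fail step just traced compares per-node counts order-insensitively: each count is used as an index into a plain bash array, and because indexed-array keys expand in ascending numeric order, the two key lists compare equal whenever the multisets of counts match, regardless of which node holds which count. A condensed sketch of that check (array names follow the trace; here nodes_sys is assigned directly, whereas the test fills it from get_meminfo reads like the ones above, and the exact echo format is approximated):

    # Order-insensitive per-node hugepage check, as in odd_alloc's verify step.
    nodes_test=(513 512)   # expected pages per node (odd total: 513 + 512)
    nodes_sys=(513 512)    # pages the kernel actually reports per node

    declare -a sorted_t sorted_s
    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1   # count becomes the array index
        sorted_s[nodes_sys[node]]=1
        echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    done

    # Both expansions yield "512 513" (indices expand sorted), so this passes
    # even if the nodes had been swapped.
    [[ ${!sorted_s[*]} == "${!sorted_t[*]}" ]] && echo "per-node allocation OK"

That is why the final trace line compares the literal strings "512 513" on both sides.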
00:05:10.650 21:30:12 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc
00:05:10.650 21:30:12 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:05:10.650 21:30:12 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:05:10.650 21:30:12 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:10.650 ************************************
00:05:10.650 START TEST custom_alloc
00:05:10.650 ************************************
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # custom_alloc
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=,
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=()
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=512
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 256
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:10.650 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 1 ))
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 1 > 0 ))
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024
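The two get_test_nr_hugepages calls just traced turn a request in kilobytes into a page count (1048576 kB / 2048 kB per page = 512 pages, then 2097152 kB = 1024 pages) and hand it to get_test_nr_hugepages_per_node, which splits evenly across nodes unless nodes_hp already pins targets. A simplified reconstruction of that logic from the trace (variable names match; the real hugepages.sh also honors user-supplied node lists and more than two nodes):

    default_hugepages=2048          # kB per hugepage (Hugepagesize in the trace)
    nodes_hp=()                     # optional explicit per-node targets

    get_test_nr_hugepages_per_node() {
        local user_nodes=()
        local _nr_hugepages=$nr_hugepages
        local _no_nodes=2           # this rig has two NUMA nodes
        local -g nodes_test=()
        if (( ${#nodes_hp[@]} > 0 )); then
            # Explicit targets (custom_alloc's nodes_hp) override the even split.
            for _no_nodes in "${!nodes_hp[@]}"; do
                nodes_test[_no_nodes]=${nodes_hp[_no_nodes]}
            done
            return 0
        fi
        # Even split, assigned from the last node down, as the trace shows.
        while (( _no_nodes > 0 )); do
            nodes_test[_no_nodes - 1]=$((_nr_hugepages / _no_nodes))
            : $((_nr_hugepages -= nodes_test[_no_nodes - 1]))
            : $((_no_nodes -= 1))
        done
    }

    get_test_nr_hugepages() {
        local size=$1               # requested total, in kB
        (( size >= default_hugepages )) || return 1
        nr_hugepages=$((size / default_hugepages))   # 1048576 kB -> 512 pages
        get_test_nr_hugepages_per_node
    }

    get_test_nr_hugepages 1048576   # nodes_test=(256 256), matching the : 256 lines

The ": 256" and ": 1" no-op lines in the trace are the remaining-page and node countdown of that while loop; once nodes_hp[0]=512 is set, the second call takes the pinned-target branch instead.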
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node
[xtrace elided: the same setup/hugepages.sh@61-@68 local-initialisation trace as above, now with _nr_hugepages=1024]
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 2 > 0 ))
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=1024
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:10.651 21:30:12 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
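With the per-node targets fixed (512 pages on node 0, 1024 on node 1, 1536 total), custom_alloc joins them into the HUGENODE string that scripts/setup.sh consumes; the local IFS=, declared at the top of the test is what turns the array into the comma-separated form above. A sketch of that assembly (same names as the trace; the setup.sh invocation form at the end is an assumption):

    IFS=,                               # custom_alloc runs with a local IFS=,
    nodes_hp=([0]=512 [1]=1024)         # targets from the two get_test_nr_hugepages calls
    HUGENODE=()
    _nr_hugepages=0

    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
        (( _nr_hugepages += nodes_hp[node] ))
    done

    echo "${HUGENODE[*]}"   # nodes_hp[0]=512,nodes_hp[1]=1024 (IFS joins with ,)
    echo "$_nr_hugepages"   # 1536, matching nr_hugepages=1536 in the trace
    # Assumed invocation: HUGENODE="${HUGENODE[*]}" ./scripts/setup.sh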
00:05:14.101 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
[output elided: the same "Already using the vfio-pci driver" line repeats for 0000:00:04.6 through 0000:00:04.0 and 0000:80:04.7 through 0000:80:04.0 (all 8086 2021), and for 0000:d8:00.0 (8086 0a54)]
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:14.101 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 39054716 kB' 'MemAvailable: 42513968 kB' 'Buffers: 4304 kB' 'Cached: 13845508 kB' 'SwapCached: 40 kB' 'Active: 11139488 kB' 'Inactive: 3310396 kB' 'Active(anon): 10627864 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 603216 kB' 'Mapped: 174984 kB' 'Shmem: 10035828 kB' 'KReclaimable: 284268 kB' 'Slab: 1249948 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965680 kB' 'KernelStack: 21840 kB' 'PageTables: 8168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957640 kB' 'Committed_AS: 11871368 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216944 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
[xtrace elided: the AnonHugePages lookup walks the compare-and-continue pair through every snapshot key from MemTotal to HardwareCorrupted before the match]
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:14.102 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 39050368 kB' 'MemAvailable: 42509620 kB' 'Buffers: 4304 kB' 'Cached: 13845512 kB' 'SwapCached: 40 kB' 'Active: 11137364 kB' 'Inactive: 3310396 kB' 'Active(anon): 10625740 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 601152 kB' 'Mapped: 174828 kB' 'Shmem: 10035832 kB' 'KReclaimable: 284268 kB' 'Slab: 1250016 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965748 kB' 'KernelStack: 21856 kB' 'PageTables: 8236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957640 kB' 'Committed_AS: 11869596 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216928 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
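The verify_nr_hugepages preamble traced above first rules out interference: because transparent hugepages are not set to [never] on this host (the setting reads "always [madvise] never"), it records AnonHugePages (0 kB here) into anon, then samples HugePages_Surp the same way before checking the per-node totals. A compact sketch of that preamble, reusing the get_meminfo sketch from earlier (the exact failure handling is an assumption):

    verify_preamble() {
        local surp resv anon
        # Anonymous THP pages could distort the counts, so note them whenever
        # THP is not fully disabled on the host.
        if [[ $(</sys/kernel/mm/transparent_hugepage/enabled) != *'[never]'* ]]; then
            anon=$(get_meminfo AnonHugePages)   # 0 kB in the run above
        fi
        surp=$(get_meminfo HugePages_Surp)      # surplus pages, 0 here
        echo "anon=${anon:-0} surp=$surp"
    }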
'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
[trace condensed: the IFS=': ' / read -r var val _ loop walks every key of the snapshot above, from MemTotal through FilePmdMapped, hitting continue on each non-match until it reaches HugePages_Surp]
00:05:14.104 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.104 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:14.104 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:14.104 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0
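The scan visible in the trace (setup/common.sh@16 through @33) is a plain bash key lookup over /proc/meminfo. A minimal sketch of a helper along those lines, reconstructed from the xtrace rather than copied from SPDK's setup/common.sh, so the exact structure is an assumption:

#!/usr/bin/env bash
# Sketch of a get_meminfo-style lookup, assuming the behavior implied by the
# xtrace above; not the verbatim SPDK setup/common.sh source.
shopt -s extglob
get_meminfo() {
    local get=$1 node=${2:-}
    local var val _ mem_f=/proc/meminfo mem
    # Prefer the per-NUMA-node view when a node is given. With node empty the
    # test probes ".../node/meminfo", which never exists, so the global file
    # is used -- exactly what the trace shows at common.sh@23 and @25.
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    # Per-node lines carry a "Node N " prefix; strip it with an extglob
    # expansion, as in mem=("${mem[@]#Node +([0-9]) }") at common.sh@29.
    mem=("${mem[@]#Node +([0-9]) }")
    # Split "Key:   value kB" on ': ' and return the value of the wanted key.
    # xtrace escapes the quoted right-hand side of [[ ]], which is why the log
    # prints patterns like \H\u\g\e\P\a\g\e\s\_\S\u\r\p.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "${val:-0}"
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    echo 0
}

With IFS=': ', read splits 'HugePages_Surp: 0' into var=HugePages_Surp and val=0, so the lookup prints 0 here, matching the echo 0 at common.sh@33 and the surp=0 assignment that follows.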
00:05:14.104 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
[trace condensed: the usual get_meminfo prologue repeats (common.sh@17-@31: local get=HugePages_Rsvd, local node=, mem_f=/proc/meminfo, the node/meminfo probe, mapfile -t mem and the Node-prefix strip)]
00:05:14.105 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 39048976 kB' 'MemAvailable: 42508228 kB' 'Buffers: 4304 kB' 'Cached: 13845516 kB' 'SwapCached: 40 kB' 'Active: 11138348 kB' 'Inactive: 3310396 kB' 'Active(anon): 10626724 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602112 kB' 'Mapped: 174916 kB' 'Shmem: 10035836 kB' 'KReclaimable: 284268 kB' 'Slab: 1250016 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965748 kB' 'KernelStack: 21840 kB' 'PageTables: 8168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957640 kB' 'Committed_AS: 11870744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216912 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
[trace condensed: the per-key scan repeats over MemTotal through HugePages_Free until HugePages_Rsvd matches]
00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536 00:05:14.107 nr_hugepages=1536 00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:14.107 resv_hugepages=0 00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:14.107 surplus_hugepages=0 00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:14.107 anon_hugepages=0 00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages ))
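The checks at hugepages.sh@106 and @108 verify the pool bookkeeping: the kernel-visible hugepage total must be accounted for by the requested pages plus any surplus and reserved pages. A sketch of that arithmetic, using the get_meminfo helper outlined earlier; the variable roles are inferred from the expanded values in the trace, not taken from the hugepages.sh source:

# Variable roles are an assumption based on the expanded values above.
nr_hugepages=1536                       # requested page count for this test
total=$(get_meminfo HugePages_Total)    # expands to 1536 in this run
surp=$(get_meminfo HugePages_Surp)      # 0
resv=$(get_meminfo HugePages_Rsvd)      # 0
echo "nr_hugepages=$nr_hugepages" "resv_hugepages=$resv" \
     "surplus_hugepages=$surp" "anon_hugepages=$(get_meminfo AnonHugePages)"
# @106: every visible page is either a requested, surplus, or reserved one.
(( total == nr_hugepages + surp + resv ))
# @108: with no surplus or reserved pages, the total must equal the request.
(( total == nr_hugepages ))

Here 1536 == 1536 + 0 + 0 holds, so both (( )) tests succeed and the custom allocation of 1536 2048 kB pages (Hugetlb: 3145728 kB) is intact.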
00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
[trace condensed: the get_meminfo prologue repeats once more (common.sh@17-@29) with get=HugePages_Total and an empty node]
00:05:14.107 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 39042776 kB' 'MemAvailable: 42502028 kB' 'Buffers: 4304 kB' 'Cached: 13845552 kB' 'SwapCached: 40 kB' 'Active: 11143148 kB' 'Inactive: 3310396 kB' 'Active(anon): 10631524 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 606912 kB' 'Mapped: 175288 kB' 'Shmem: 10035872 kB' 'KReclaimable: 284268 kB' 'Slab: 1250016 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965748 kB' 'KernelStack: 21840 kB' 'PageTables: 8200 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957640 kB' 'Committed_AS: 11875872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216900 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
[trace condensed: the per-key scan toward HugePages_Total proceeds from MemTotal onward]
[[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var 
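[Editor's note: the loop traced above is setup/common.sh's get_meminfo helper: it snapshots a meminfo file, strips the "Node <id> " prefix that per-node files carry, then splits each line on ': ' and skips fields until the requested one matches, echoing its value. A minimal runnable sketch of that pattern follows; it is a simplification under stated assumptions, not the verbatim SPDK helper.]

    #!/usr/bin/env bash
    # Sketch of the scan pattern visible in the trace (assumption: simplified
    # from setup/common.sh; the real helper differs in details).
    shopt -s extglob   # needed for the +([0-9]) pattern below

    get_meminfo() {
            local get=$1 node=$2
            local var val _ line mem_f mem
            mem_f=/proc/meminfo
            # Prefer the per-node view when a node id was passed and exists.
            [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
                    mem_f=/sys/devices/system/node/node$node/meminfo
            mapfile -t mem <"$mem_f"
            # Per-node meminfo prefixes every line with "Node <id> "; strip it.
            mem=("${mem[@]#Node +([0-9]) }")
            for line in "${mem[@]}"; do
                    IFS=': ' read -r var val _ <<<"$line"
                    # Skip fields until the requested one matches, then print it.
                    [[ $var == "$get" ]] && { echo "$val"; return 0; }
            done
            return 1
    }

    get_meminfo HugePages_Total      # prints 1536 on this runner, per the trace
    get_meminfo HugePages_Surp 0     # per-node lookup, as in the trace below

[The field-by-field compare-and-continue records that dominate this log are exactly this loop running under set -x.]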
00:05:14.109 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18637768 kB' 'MemUsed: 13947600 kB' 'SwapCached: 8 kB' 'Active: 8453248 kB' 'Inactive: 3089056 kB' 'Active(anon): 8119440 kB' 'Inactive(anon): 7988 kB' 'Active(file): 333808 kB' 'Inactive(file): 3081068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11080056 kB' 'Mapped: 94204 kB' 'AnonPages: 465368 kB' 'Shmem: 7665172 kB' 'KernelStack: 12328 kB' 'PageTables: 5036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 176168 kB' 'Slab: 658320 kB' 'SReclaimable: 176168 kB' 'SUnreclaim: 482152 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace elided: each node0 meminfo field from MemTotal through FilePmdMapped is compared against HugePages_Surp and skipped with continue ...]
[... xtrace elided: Unaccepted, HugePages_Total and HugePages_Free likewise skipped ...]
00:05:14.110 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:14.110 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:14.110 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:14.111 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698432 kB' 'MemFree: 20407492 kB' 'MemUsed: 7290940 kB' 'SwapCached: 32 kB' 'Active: 2684028 kB' 'Inactive: 221340 kB' 'Active(anon): 2506212 kB' 'Inactive(anon): 48 kB' 'Active(file): 177816 kB' 'Inactive(file): 221292 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2769860 kB' 'Mapped: 80208 kB' 'AnonPages: 135588 kB' 'Shmem: 2370720 kB' 'KernelStack: 9592 kB' 'PageTables: 3004 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 108100 kB' 'Slab: 591696 kB' 'SReclaimable: 108100 kB' 'SUnreclaim: 483596 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
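[Editor's note: throughout this trace the right-hand side of each [[ ... == ... ]] appears as \H\u\g\e\P\a\g\e\s\_\S\u\r\p. That is not garbling: when xtrace logs a quoted pattern word inside [[ ]], bash re-quotes it by backslash-escaping every character, so the logged command would still compare literally (not as a glob) if re-executed. A standalone two-line demonstration, not taken from the SPDK scripts:]

    set -x
    var=HugePages_Surp
    [[ $var == "HugePages_Surp" ]] && echo match
    # xtrace prints: + [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]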
[... xtrace elided: each node1 meminfo field from MemTotal through FilePmdMapped is compared against HugePages_Surp and skipped with continue ...]
[... xtrace elided: Unaccepted, HugePages_Total and HugePages_Free likewise skipped ...]
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024'
node1=1024 expecting 1024
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:05:14.112 
00:05:14.112 real	0m3.334s
00:05:14.112 user	0m1.147s
00:05:14.112 sys	0m2.209s
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:05:14.112 21:30:15 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:14.112 ************************************
00:05:14.112 END TEST custom_alloc
00:05:14.112 ************************************
00:05:14.112 21:30:15 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc
00:05:14.112 21:30:15 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:05:14.112 21:30:15 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
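[Editor's note: hugepages.sh@114-@129 above fold any reserved/surplus pages into nodes_test and then compare the per-node expectation against what sysfs reports, printing the "nodeN=X expecting Y" lines seen in the log. A condensed sketch of that check, using this run's values (512 pages on node0, 1024 on node1); the variable names follow the trace, the joined-list comparison is our approximation of hugepages.sh@129:]

    # Expected per-node counts configured by the test vs. counts read back.
    nodes_test=([0]=512 [1]=1024)
    nodes_sys=([0]=512 [1]=1024)

    for node in "${!nodes_test[@]}"; do
            # HugePages_Surp was 0 on both nodes above, so nothing is added.
            (( nodes_test[node] += 0 ))
            echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    done

    # Final comma-joined comparison, mirroring [[ 512,1024 == 512,1024 ]].
    (IFS=,; [[ "${nodes_test[*]}" == "${nodes_sys[*]}" ]] && echo "hugepage layout OK")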
00:05:14.112 21:30:15 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:14.112 ************************************
00:05:14.112 START TEST no_shrink_alloc
00:05:14.112 ************************************
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # no_shrink_alloc
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0')
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:14.112 21:30:15 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:17.402 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:17.402 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:17.402 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages
00:05:17.402 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:17.402 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:17.402 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:17.402 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:17.402 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
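[Editor's note: get_test_nr_hugepages above turns the requested size (2097152 kB) into a page count; given 'Hugepagesize: 2048 kB' in the meminfo dump below, 2097152 / 2048 = 1024, matching NRHUGE=1024. verify_nr_hugepages then gates the AnonHugePages lookup on transparent hugepages not being forced off ([never] absent from the THP control file, as hugepages.sh@95 checks). A sketch of both steps; the division is our inference from the traced values, not a quote of hugepages.sh:]

    default_hugepages=2048      # kB; from 'Hugepagesize: 2048 kB' in the dump
    size=2097152                # kB; the argument passed to get_test_nr_hugepages
    if (( size >= default_hugepages )); then
            nr_hugepages=$(( size / default_hugepages ))
    fi
    echo "$nr_hugepages"        # 1024, matching NRHUGE=1024 in the trace

    # THP gate: the bracketed word in this file is the active policy; only
    # probe AnonHugePages when THP is not disabled outright.
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. 'always [madvise] never'
    if [[ $thp != *"[never]"* ]]; then
            anon=$(get_meminfo AnonHugePages)   # helper sketched after the first scan
    fi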
00:05:17.403 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40119260 kB' 'MemAvailable: 43578512 kB' 'Buffers: 4304 kB' 'Cached: 13845676 kB' 'SwapCached: 40 kB' 'Active: 11139220 kB' 'Inactive: 3310396 kB' 'Active(anon): 10627596 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602808 kB' 'Mapped: 174564 kB' 'Shmem: 10035996 kB' 'KReclaimable: 284268 kB' 'Slab: 1249536 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965268 kB' 'KernelStack: 22000 kB' 'PageTables: 8316 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11871500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217152 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
[... xtrace elided: /proc/meminfo fields from MemTotal through WritebackTmp are compared against AnonHugePages and skipped with continue; the scan continues below ...]
00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0 
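The scan traced above is the whole of get_meminfo: slurp the meminfo file, then walk it key by key until the requested key matches and echo its value. Below is a minimal reconstruction read back from this xtrace (statement layout inferred from the common.sh@17-@33 references; the real setup/common.sh may differ in detail, and the @25 [[ -n '' ]] guard is omitted here):

#!/usr/bin/env bash
# Reconstructed from the xtrace above; not a verbatim copy of setup/common.sh.
shopt -s extglob

get_meminfo() { # usage: get_meminfo <key> [<numa node>]
	local get=$1
	local node=$2
	local var val
	local mem_f mem

	mem_f=/proc/meminfo
	# Prefer the per-node file when it exists. node is empty in the trace
	# above, so /sys/devices/system/node/node/meminfo is checked and rejected.
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi

	mapfile -t mem < "$mem_f"
	# Per-node files prefix each row with "Node <N> "; strip it (extglob).
	mem=("${mem[@]#Node +([0-9]) }")

	# "AnonHugePages:       0 kB" splits into var=AnonHugePages, val=0, _=kB,
	# which is why the trace above ends in "echo 0" for a "0 kB" entry.
	while IFS=': ' read -r var val _; do
		if [[ $var == "$get" ]]; then
			echo "$val"
			return 0
		fi
	done < <(printf '%s\n' "${mem[@]}")
	return 1
}

get_meminfo AnonHugePages # prints 0 on the machine captured in this log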
00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:17.404 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:17.405 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:17.405 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:17.405 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.405 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.405 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.405 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.405 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.405 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:17.405 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:17.405 21:30:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40118004 kB' 'MemAvailable: 43577256 kB' 'Buffers: 4304 kB' 'Cached: 13845680 kB' 'SwapCached: 40 kB' 'Active: 11139132 kB' 'Inactive: 3310396 kB' 'Active(anon): 10627508 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602792 kB' 'Mapped: 174416 kB' 'Shmem: 10036000 kB' 'KReclaimable: 284268 kB' 'Slab: 1249400 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965132 kB' 'KernelStack: 21920 kB' 'PageTables: 8288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11871524 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217168 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
[xtrace condensed: every key from MemTotal through HugePages_Rsvd is compared against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and skipped with continue]
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
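One line worth pausing on is mem=("${mem[@]#Node +([0-9]) }") at common.sh@29: per-node meminfo files under /sys prefix every row with "Node <N> ", and this extglob expansion strips that prefix so the same parser handles both file layouts. A small standalone demonstration (the sample rows are illustrative, not taken from this run):

#!/usr/bin/env bash
# Demonstrates the "Node <N> " prefix strip; the +([0-9]) pattern needs extglob.
shopt -s extglob

mem=('Node 0 MemTotal: 60283800 kB' 'Node 0 HugePages_Total: 1024')
mem=("${mem[@]#Node +([0-9]) }")
printf '%s\n' "${mem[@]}"
# MemTotal: 60283800 kB
# HugePages_Total: 1024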
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40116316 kB' 'MemAvailable: 43575568 kB' 'Buffers: 4304 kB' 'Cached: 13845696 kB' 'SwapCached: 40 kB' 'Active: 11139232 kB' 'Inactive: 3310396 kB' 'Active(anon): 10627608 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602764 kB' 'Mapped: 174416 kB' 'Shmem: 10036016 kB' 'KReclaimable: 284268 kB' 'Slab: 1249400 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965132 kB' 'KernelStack: 21936 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11871676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217136 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
00:05:17.407 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[xtrace condensed: every key from MemTotal through HugePages_Free is compared against \H\u\g\e\P\a\g\e\s\_\R\s\v\d and skipped with continue]
00:05:17.409 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:17.409 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:17.409 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:17.409 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:17.409 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:17.409 nr_hugepages=1024
21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:17.409 resv_hugepages=0
21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:17.409 surplus_hugepages=0
21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:17.409 anon_hugepages=0
21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:17.409 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
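Taken together, the get_meminfo results and nr_hugepages drive the assertion at hugepages.sh@106-@108: the pool is only considered healthy if the requested page count is fully accounted for, with no surplus or reserved pages left over. A sketch of that check with the values visible in this run (verify_hugepage_count is a hypothetical name, not a helper in hugepages.sh):

#!/usr/bin/env bash
# Hypothetical condensation of the checks traced at hugepages.sh@101-@108.
verify_hugepage_count() {
	local requested=$1 nr_hugepages=$2 surp=$3 resv=$4 anon=$5
	echo "nr_hugepages=$nr_hugepages"
	echo "resv_hugepages=$resv"
	echo "surplus_hugepages=$surp"
	# anon is reported but, as in the trace, not part of the @106 sum.
	echo "anon_hugepages=$anon"
	((requested == nr_hugepages + surp + resv)) || return 1
	((requested == nr_hugepages)) || return 1
}

verify_hugepage_count 1024 1024 0 0 0 # passes for the values logged above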
get=HugePages_Total 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40116144 kB' 'MemAvailable: 43575396 kB' 'Buffers: 4304 kB' 'Cached: 13845732 kB' 'SwapCached: 40 kB' 'Active: 11138948 kB' 'Inactive: 3310396 kB' 'Active(anon): 10627324 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602888 kB' 'Mapped: 174416 kB' 'Shmem: 10036052 kB' 'KReclaimable: 284268 kB' 'Slab: 1249400 kB' 'SReclaimable: 284268 kB' 'SUnreclaim: 965132 kB' 'KernelStack: 22000 kB' 'PageTables: 8252 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11872072 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217168 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
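[Annotation] The long runs of '[[ <key> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] ... continue' entries above and below are bash xtrace of one linear scan: get_meminfo snapshots a meminfo file (printf + mapfile), then re-reads it field by field with IFS=': ' until the requested key matches, echoes the value, and returns. A minimal, self-contained sketch of that pattern, inferred from this trace and assuming bash 4+; it is a simplification, not the verbatim setup/common.sh helper:

  # Look up one field from /proc/meminfo, or from a NUMA node's meminfo
  # when a node index is given. Per-node files prefix each key with
  # "Node <n> ", so that prefix is stripped before comparing keys.
  get_meminfo_sketch() {
      local get=$1 node=${2:-}
      local mem_f=/proc/meminfo
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local var val _
      while IFS=': ' read -r var val _; do
          # Same effect as the escaped literal match in the trace,
          # e.g. [[ $var == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]].
          if [[ $var == "$get" ]]; then
              echo "${val:-0}"
              return 0
          fi
      done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
      return 1
  }

With the snapshot printed above, get_meminfo_sketch HugePages_Rsvd would print 0 and get_meminfo_sketch HugePages_Total would print 1024, matching the '# echo 0' and '# echo 1024' / '# return 0' exits visible in this trace.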
00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.410 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 
21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.411 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:17.412 21:30:19 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 17615380 kB' 'MemUsed: 14969988 kB' 'SwapCached: 8 kB' 'Active: 8455180 kB' 'Inactive: 3089056 kB' 'Active(anon): 8121372 kB' 'Inactive(anon): 7988 kB' 'Active(file): 333808 kB' 'Inactive(file): 3081068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11080168 kB' 'Mapped: 94208 kB' 'AnonPages: 467212 kB' 'Shmem: 7665284 kB' 'KernelStack: 12344 kB' 'PageTables: 5040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 176168 kB' 'Slab: 657764 kB' 'SReclaimable: 176168 kB' 'SUnreclaim: 481596 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.412 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 
21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.413 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.414 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.414 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.414 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:17.414 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.414 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:17.672 21:30:19 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:17.672 node0=1024 expecting 1024 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:17.672 21:30:19 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:20.204 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:20.204 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:20.204 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40120828 kB' 'MemAvailable: 43580076 kB' 'Buffers: 4304 kB' 'Cached: 13845828 kB' 'SwapCached: 40 kB' 'Active: 11139200 kB' 'Inactive: 3310396 kB' 'Active(anon): 10627576 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602796 kB' 'Mapped: 174504 kB' 'Shmem: 10036148 kB' 'KReclaimable: 284260 kB' 'Slab: 1249672 kB' 'SReclaimable: 284260 kB' 'SUnreclaim: 965412 kB' 'KernelStack: 22032 kB' 'PageTables: 8216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11872748 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217088 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 
21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.204 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
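[Annotation] Stepping back from the field-by-field scans, the checks this second verify pass performs can be condensed as follows. This is a paraphrase of the verify_nr_hugepages flow visible in the xtrace (the transparent-hugepage '[never]' test at hugepages.sh@95, the anon/surp/resv lookups, the '(( 1024 == nr_hugepages + surp + resv ))' identity, and the per-node 'node0=1024 expecting 1024' report earlier in the log); it reuses the hypothetical get_meminfo_sketch defined above and is a sketch of the logic, not the SPDK script itself:

  # Condensed verification: the system-wide hugepage count must equal the
  # requested count plus surplus plus reserved pages, and each NUMA node's
  # allocation is then reported for comparison with what the test expects.
  verify_nr_hugepages_sketch() {
      local nr_hugepages=$1 node
      # Only read AnonHugePages when transparent hugepages are not "[never]",
      # mirroring the '!= *\[\n\e\v\e\r\]*' test in the trace (anon=0 here).
      local anon=0
      if [[ $(</sys/kernel/mm/transparent_hugepage/enabled) != *'[never]'* ]]; then
          anon=$(get_meminfo_sketch AnonHugePages)
      fi
      local surp resv
      surp=$(get_meminfo_sketch HugePages_Surp)
      resv=$(get_meminfo_sketch HugePages_Rsvd)
      echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp anon_hugepages=$anon"
      # System-wide identity, checked twice in the trace (@106 and @109):
      (( $(get_meminfo_sketch HugePages_Total) == nr_hugepages + surp + resv )) ||
          return 1
      # Per-node report, as in the earlier "node0=1024 expecting 1024" line.
      # HUGENODE=0 NRHUGE=512 asked node0 for 512 pages, but the INFO line
      # above suggests setup.sh left the existing 1024 in place (CLEAR_HUGE=no).
      for node in /sys/devices/system/node/node[0-9]*; do
          [[ -e $node/meminfo ]] || continue
          node=${node##*node}
          echo "node$node=$(get_meminfo_sketch HugePages_Total "$node")"
      done
  }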
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[xtrace trimmed: Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu and HardwareCorrupted each fail the [[ $var == AnonHugePages ]] test at setup/common.sh@32 and repeat the continue / IFS=': ' / read -r var val _ cycle]
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:20.205 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:20.206 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40124284 kB' 'MemAvailable: 43583532 kB' 'Buffers: 4304 kB' 'Cached: 13845832 kB' 'SwapCached: 40 kB' 'Active: 11138684 kB' 'Inactive: 3310396 kB' 'Active(anon): 10627060 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602796 kB' 'Mapped: 174428 kB' 'Shmem: 10036152 kB' 'KReclaimable: 284260 kB' 'Slab: 1249600 kB' 'SReclaimable: 284260 kB' 'SUnreclaim: 965340 kB' 'KernelStack: 22016 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11872768 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217008 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
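The xtrace above shows the whole shape of the traced helper: get_meminfo resolves a meminfo file (@22-@23), loads it into an array (@28), strips any per-node "Node N " prefixes (@29), then scans key by key with IFS=': ' read until the requested key matches and its value is echoed (@31-@33). Below is a minimal sketch reconstructed from those trace tags; it is not the actual setup/common.sh source, and the missing-key fallback at the end is an assumption. With node empty, the @23 test probes the nonexistent path /sys/devices/system/node/node/meminfo, so /proc/meminfo is used.

#!/usr/bin/env bash
# Sketch of get_meminfo as implied by the setup/common.sh@17-@33 xtrace above
# (reconstruction; the real script may differ).
shopt -s extglob                          # the +([0-9]) pattern at @29 needs extglob

get_meminfo() {
    local get=$1
    local node=${2:-}                     # empty in this trace
    local var val
    local mem_f mem
    mem_f=/proc/meminfo
    # @23: with node="" this probes .../node/node/meminfo, which never exists,
    # so the /proc/meminfo fallback is kept
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem <"$mem_f"              # @28: one array element per line
    mem=("${mem[@]#Node +([0-9]) }")      # @29: drop "Node N " prefixes
    while IFS=': ' read -r var val _; do  # @31
        [[ $var == "$get" ]] || continue  # @32: the long continue runs above
        echo "$val"                       # @33: e.g. "0" for AnonHugePages here
        return 0
    done < <(printf '%s\n' "${mem[@]}")   # @16
    return 1                              # assumption: key not found
}

surp=$(get_meminfo HugePages_Surp)        # how hugepages.sh@98 captures the value
echo "surp=$surp"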
[xtrace trimmed: every key in the snapshot above, from MemTotal through HugePages_Rsvd, fails the [[ $var == HugePages_Surp ]] test at setup/common.sh@32 and repeats the continue / IFS=': ' / read -r var val _ cycle]
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:20.470 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40123808 kB' 'MemAvailable: 43583056 kB' 'Buffers: 4304 kB' 'Cached: 13845848 kB' 'SwapCached: 40 kB' 'Active: 11139304 kB' 'Inactive: 3310396 kB' 'Active(anon): 10627680 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602840 kB' 'Mapped: 174428 kB' 'Shmem: 10036168 kB' 'KReclaimable: 284260 kB' 'Slab: 1249664 kB' 'SReclaimable: 284260 kB' 'SUnreclaim: 965404 kB' 'KernelStack: 21920 kB' 'PageTables: 8284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11872788 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217056 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
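Note the @29 expansion in the setup sequence above: per-node meminfo files under /sys/devices/system/node prefix every line with "Node N ", while /proc/meminfo does not, so stripping that prefix lets the same read loop serve both sources. A standalone demonstration of just that expansion (the MemFree value in the per-node line is made up for illustration):

#!/usr/bin/env bash
shopt -s extglob                     # required for +([0-9]) in the pattern
mem=('Node 0 MemFree: 39992 kB'      # per-node style line (hypothetical value)
     'MemTotal: 60283800 kB')        # /proc/meminfo style line
mem=("${mem[@]#Node +([0-9]) }")     # the @29 expansion
printf '%s\n' "${mem[@]}"
# prints:
#   MemFree: 39992 kB
#   MemTotal: 60283800 kB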
[xtrace trimmed: every key in the snapshot above, from MemTotal through HugePages_Free, fails the [[ $var == HugePages_Rsvd ]] test at setup/common.sh@32 and repeats the continue / IFS=': ' / read -r var val _ cycle]
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:20.472 nr_hugepages=1024
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:20.472 resv_hugepages=0
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:20.472 surplus_hugepages=0
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:20.472 anon_hugepages=0
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
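With anon, surp and resv all read back as 0 and the snapshots reporting HugePages_Total: 1024 (1024 pages x 2048 kB matches the 'Hugetlb: 2097152 kB' line), both arithmetic guards at hugepages.sh@106 and @108 pass: no surplus or reserved pages appeared, and the pool was not shrunk, which is the point of the no_shrink_alloc case. A sketch of that check follows; the left-hand variable name is inferred, since the xtrace prints values after expansion (hence the literal 1024 above):

#!/usr/bin/env bash
expected=1024                 # pages the test configured (assumed variable name)
nr_hugepages=1024             # get_meminfo HugePages_Total
surp=0                        # get_meminfo HugePages_Surp
resv=0                        # get_meminfo HugePages_Rsvd
(( expected == nr_hugepages + surp + resv ))  # @106: accounting is consistent
(( expected == nr_hugepages ))                # @108: pool was not shrunk
echo "no_shrink_alloc check passed"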
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:20.472 21:30:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:20.472 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:20.472 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:20.472 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:20.472 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:20.472 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:20.472 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:20.472 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283800 kB' 'MemFree: 40124404 kB' 'MemAvailable: 43583652 kB' 'Buffers: 4304 kB' 'Cached: 13845872 kB' 'SwapCached: 40 kB' 'Active: 11138988 kB' 'Inactive: 3310396 kB' 'Active(anon): 10627364 kB' 'Inactive(anon): 8036 kB' 'Active(file): 511624 kB' 'Inactive(file): 3302360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8356092 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602504 kB' 'Mapped: 174428 kB' 'Shmem: 10036192 kB' 'KReclaimable: 284260 kB' 'Slab: 1249664 kB' 'SReclaimable: 284260 kB' 'SUnreclaim: 965404 kB' 'KernelStack: 21824 kB' 'PageTables: 8236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481928 kB' 'Committed_AS: 11871208 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 216928 kB' 'VmallocChunk: 0 kB' 'Percpu: 84672 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3216756 kB' 'DirectMap2M: 47849472 kB' 'DirectMap1G: 18874368 kB'
[xtrace trimmed: the per-key scan for HugePages_Total is under way; MemTotal through WritebackTmp have failed the [[ $var == HugePages_Total ]] test at setup/common.sh@32 so far, and the excerpt ends mid-scan]
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.473 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:20.474 
21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 17633576 kB' 'MemUsed: 14951792 kB' 'SwapCached: 8 kB' 'Active: 8452980 kB' 'Inactive: 3089056 kB' 'Active(anon): 8119172 kB' 'Inactive(anon): 7988 kB' 'Active(file): 333808 kB' 'Inactive(file): 3081068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11080268 kB' 'Mapped: 94220 kB' 'AnonPages: 464960 kB' 'Shmem: 7665384 kB' 'KernelStack: 12328 kB' 'PageTables: 5040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 176160 kB' 'Slab: 658140 kB' 'SReclaimable: 176160 kB' 'SUnreclaim: 481980 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.474 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:20.475 node0=1024 expecting 1024 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:20.475 00:05:20.475 real 0m6.435s 00:05:20.475 user 0m2.279s 00:05:20.475 sys 0m4.186s 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:20.475 21:30:22 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:20.475 ************************************ 00:05:20.475 END TEST no_shrink_alloc 00:05:20.475 ************************************ 00:05:20.475 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:05:20.475 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:20.475 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:20.475 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:20.475 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:20.476 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:20.476 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:20.476 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:20.476 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:20.476 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:20.476 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:20.476 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:20.476 21:30:22 setup.sh.hugepages -- 
setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:20.476 21:30:22 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:20.476 00:05:20.476 real 0m22.265s 00:05:20.476 user 0m7.522s 00:05:20.476 sys 0m13.212s 00:05:20.476 21:30:22 setup.sh.hugepages -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:20.476 21:30:22 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:20.476 ************************************ 00:05:20.476 END TEST hugepages 00:05:20.476 ************************************ 00:05:20.476 21:30:22 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:20.476 21:30:22 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:20.476 21:30:22 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:20.476 21:30:22 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:20.735 ************************************ 00:05:20.735 START TEST driver 00:05:20.735 ************************************ 00:05:20.735 21:30:22 setup.sh.driver -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:20.735 * Looking for test storage... 00:05:20.735 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:20.735 21:30:22 setup.sh.driver -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:05:20.735 21:30:22 setup.sh.driver -- common/autotest_common.sh@1689 -- # lcov --version 00:05:20.735 21:30:22 setup.sh.driver -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:05:20.735 21:30:22 setup.sh.driver -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:20.735 21:30:22 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:05:20.735 21:30:22 setup.sh.driver -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:20.735 21:30:22 setup.sh.driver -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:05:20.735 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:20.735 --rc genhtml_branch_coverage=1 00:05:20.735 --rc genhtml_function_coverage=1 00:05:20.735 --rc genhtml_legend=1 00:05:20.735 --rc geninfo_all_blocks=1 00:05:20.735 --rc geninfo_unexecuted_blocks=1 00:05:20.735 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:20.735 ' 00:05:20.735 21:30:22 setup.sh.driver -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:05:20.735 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:20.735 --rc genhtml_branch_coverage=1 00:05:20.735 --rc genhtml_function_coverage=1 00:05:20.735 --rc genhtml_legend=1 00:05:20.735 --rc geninfo_all_blocks=1 00:05:20.735 --rc geninfo_unexecuted_blocks=1 00:05:20.735 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:20.735 ' 00:05:20.735 21:30:22 setup.sh.driver -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:05:20.735 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:20.735 --rc genhtml_branch_coverage=1 00:05:20.735 --rc genhtml_function_coverage=1 00:05:20.735 --rc genhtml_legend=1 00:05:20.735 --rc geninfo_all_blocks=1 00:05:20.735 --rc geninfo_unexecuted_blocks=1 00:05:20.735 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:20.735 ' 00:05:20.735 21:30:22 setup.sh.driver -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:05:20.735 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:20.735 --rc genhtml_branch_coverage=1 00:05:20.735 --rc genhtml_function_coverage=1 00:05:20.735 --rc genhtml_legend=1 00:05:20.735 --rc geninfo_all_blocks=1 00:05:20.735 --rc geninfo_unexecuted_blocks=1 00:05:20.735 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:20.735 ' 00:05:20.735 21:30:22 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:20.735 21:30:22 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:20.735 21:30:22 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:26.004 21:30:27 setup.sh.driver -- 
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:26.004 21:30:27 setup.sh.driver -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:26.004 21:30:27 setup.sh.driver -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:26.004 21:30:27 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:26.004 ************************************ 00:05:26.004 START TEST guess_driver 00:05:26.004 ************************************ 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # guess_driver 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:26.004 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:26.004 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:26.004 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:26.004 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:26.004 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:26.004 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:26.004 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:26.004 Looking for driver=vfio-pci 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
# setup output config 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:26.004 21:30:27 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.536 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.795 21:30:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:30.697 21:30:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:30.697 21:30:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:30.697 21:30:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:30.697 21:30:31 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:30.697 21:30:31 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:30.697 21:30:31 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:30.698 21:30:31 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:34.878 00:05:34.878 real 0m9.224s 00:05:34.878 user 0m2.283s 00:05:34.878 sys 0m4.553s 00:05:34.878 21:30:36 setup.sh.driver.guess_driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:34.878 21:30:36 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:34.878 ************************************ 00:05:34.878 END TEST guess_driver 00:05:34.878 ************************************ 00:05:34.878 00:05:34.878 real 0m14.110s 00:05:34.878 user 0m3.651s 00:05:34.878 sys 0m7.304s 00:05:34.878 21:30:36 setup.sh.driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:34.878 21:30:36 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:34.878 ************************************ 00:05:34.878 END TEST driver 00:05:34.878 ************************************ 00:05:34.878 21:30:36 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:34.878 21:30:36 setup.sh -- 
common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:34.878 21:30:36 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:34.878 21:30:36 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:34.878 ************************************ 00:05:34.878 START TEST devices 00:05:34.878 ************************************ 00:05:34.878 21:30:36 setup.sh.devices -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:34.878 * Looking for test storage... 00:05:34.878 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:34.878 21:30:36 setup.sh.devices -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:05:34.878 21:30:36 setup.sh.devices -- common/autotest_common.sh@1689 -- # lcov --version 00:05:34.878 21:30:36 setup.sh.devices -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:05:34.878 21:30:36 setup.sh.devices -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@336 -- # IFS=.-: 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@336 -- # read -ra ver1 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@337 -- # IFS=.-: 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@337 -- # read -ra ver2 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@338 -- # local 'op=<' 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@340 -- # ver1_l=2 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@341 -- # ver2_l=1 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@344 -- # case "$op" in 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@345 -- # : 1 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@365 -- # decimal 1 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@353 -- # local d=1 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@355 -- # echo 1 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@365 -- # ver1[v]=1 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@366 -- # decimal 2 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@353 -- # local d=2 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@355 -- # echo 2 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@366 -- # ver2[v]=2 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:34.878 21:30:36 setup.sh.devices -- scripts/common.sh@368 -- # return 0 00:05:34.878 21:30:36 setup.sh.devices -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:34.878 21:30:36 setup.sh.devices -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:05:34.878 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.879 --rc genhtml_branch_coverage=1 00:05:34.879 --rc genhtml_function_coverage=1 00:05:34.879 --rc genhtml_legend=1 00:05:34.879 --rc geninfo_all_blocks=1 00:05:34.879 --rc geninfo_unexecuted_blocks=1 00:05:34.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.879 ' 00:05:34.879 21:30:36 setup.sh.devices -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:05:34.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.879 --rc genhtml_branch_coverage=1 00:05:34.879 --rc genhtml_function_coverage=1 00:05:34.879 --rc genhtml_legend=1 00:05:34.879 --rc geninfo_all_blocks=1 00:05:34.879 --rc geninfo_unexecuted_blocks=1 00:05:34.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.879 ' 00:05:34.879 21:30:36 setup.sh.devices -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:05:34.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.879 --rc genhtml_branch_coverage=1 00:05:34.879 --rc genhtml_function_coverage=1 00:05:34.879 --rc genhtml_legend=1 00:05:34.879 --rc geninfo_all_blocks=1 00:05:34.879 --rc geninfo_unexecuted_blocks=1 00:05:34.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.879 ' 00:05:34.879 21:30:36 setup.sh.devices -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:05:34.879 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:34.879 --rc genhtml_branch_coverage=1 00:05:34.879 --rc genhtml_function_coverage=1 00:05:34.879 --rc genhtml_legend=1 00:05:34.879 --rc geninfo_all_blocks=1 00:05:34.879 --rc geninfo_unexecuted_blocks=1 00:05:34.879 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:34.879 ' 00:05:34.879 21:30:36 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:34.879 21:30:36 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:34.879 21:30:36 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:34.879 21:30:36 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:39.062 21:30:40 setup.sh.devices -- common/autotest_common.sh@1653 -- # zoned_devs=() 00:05:39.062 21:30:40 setup.sh.devices -- common/autotest_common.sh@1653 -- # local -gA zoned_devs 00:05:39.062 21:30:40 setup.sh.devices -- common/autotest_common.sh@1654 -- # local nvme bdf 00:05:39.062 21:30:40 setup.sh.devices -- common/autotest_common.sh@1656 -- # for nvme in /sys/block/nvme* 00:05:39.062 21:30:40 setup.sh.devices -- common/autotest_common.sh@1657 -- # is_block_zoned nvme0n1 00:05:39.062 21:30:40 setup.sh.devices -- common/autotest_common.sh@1646 -- # local device=nvme0n1 00:05:39.062 21:30:40 setup.sh.devices -- common/autotest_common.sh@1648 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:39.062 21:30:40 setup.sh.devices -- common/autotest_common.sh@1649 -- # [[ none != none ]] 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:39.062 21:30:40 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:39.062 21:30:40 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:39.062 No valid GPT data, bailing 00:05:39.062 21:30:40 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:39.062 21:30:40 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:39.062 21:30:40 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:39.062 21:30:40 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:39.062 21:30:40 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:39.062 21:30:40 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:39.062 21:30:40 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:39.062 21:30:40 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:39.062 21:30:40 
setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:39.062 21:30:40 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:39.062 ************************************ 00:05:39.062 START TEST nvme_mount 00:05:39.062 ************************************ 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # nvme_mount 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:39.062 21:30:40 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:39.999 Creating new GPT entries in memory. 00:05:39.999 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:39.999 other utilities. 00:05:39.999 21:30:41 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:39.999 21:30:41 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:39.999 21:30:41 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:39.999 21:30:41 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:39.999 21:30:41 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:40.935 Creating new GPT entries in memory. 00:05:40.935 The operation has completed successfully. 
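The partition step traced above reduces to a short sgdisk sequence: size=1073741824 is converted to 512-byte sectors, the GPT is zapped, and one 1 GiB partition is created at sectors 2048-2099199 under flock so concurrent callers cannot race on the disk. A minimal standalone sketch of that sequence, assuming /dev/nvme0n1 is a disposable scratch disk and using udevadm settle as a stand-in for the repo's sync_dev_uevents.sh helper:

#!/usr/bin/env bash
disk=/dev/nvme0n1
size=$(( 1073741824 / 512 ))            # 1 GiB in 512-byte sectors (2097152)
part_start=2048                         # first usable sector after the GPT header
part_end=$(( part_start + size - 1 ))   # 2099199, matching the trace
sgdisk "$disk" --zap-all                # destroy any existing GPT/MBR structures
flock "$disk" sgdisk "$disk" --new=1:"$part_start":"$part_end"
udevadm settle                          # wait for /dev/nvme0n1p1 to appear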
00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 3289356 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:40.935 21:30:42 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.213 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:44.214 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:44.214 21:30:45 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:44.472 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:44.472 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:44.472 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:44.472 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:44.472 21:30:46 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:47.756 21:30:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:47.756 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:47.757 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:47.757 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # 
local pci status 00:05:47.757 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.757 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:47.757 21:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:47.757 21:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:47.757 21:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.288 21:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.546 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:50.547 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:50.547 00:05:50.547 real 0m11.891s 00:05:50.547 user 0m3.293s 00:05:50.547 sys 0m6.439s 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:50.547 21:30:52 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:50.547 ************************************ 00:05:50.547 END TEST nvme_mount 00:05:50.547 ************************************ 00:05:50.805 21:30:52 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:50.805 21:30:52 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:50.805 21:30:52 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:50.805 21:30:52 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:50.805 ************************************ 00:05:50.805 START TEST dm_mount 00:05:50.805 ************************************ 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # dm_mount 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:50.805 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:50.806 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:50.806 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:50.806 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:50.806 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:50.806 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:50.806 21:30:52 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:51.744 Creating new GPT entries in memory. 00:05:51.744 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:51.744 other utilities. 00:05:51.744 21:30:53 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:51.744 21:30:53 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:51.744 21:30:53 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:51.744 21:30:53 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:51.744 21:30:53 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:52.682 Creating new GPT entries in memory. 00:05:52.682 The operation has completed successfully. 00:05:52.682 21:30:54 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:52.682 21:30:54 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:52.682 21:30:54 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:52.682 21:30:54 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:52.682 21:30:54 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:54.061 The operation has completed successfully. 
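For dm_mount the same partition_drive helper runs with part_no=2, so after the zap it lays down two adjacent 1 GiB partitions (sectors 2048-2099199 and 2099200-4196351). The dmsetup create nvme_dm_test call that follows assembles them into the single /dev/dm-0 device the test then formats and mounts. The trace never prints the device-mapper table itself, so the sketch below is only an illustrative linear-target equivalent, assuming both partitions already exist:

p1=/dev/nvme0n1p1
p2=/dev/nvme0n1p2
s1=$(blockdev --getsz "$p1")   # partition sizes in 512-byte sectors
s2=$(blockdev --getsz "$p2")
dmsetup create nvme_dm_test <<EOF
0 $s1 linear $p1 0
$s1 $s2 linear $p2 0
EOF
readlink -f /dev/mapper/nvme_dm_test   # resolves to /dev/dm-0 in the trace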
00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 3293550 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:54.061 21:30:55 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:57.348 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:57.349 21:30:58 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:57.349 21:30:58 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:00.634 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:00.634 00:06:00.634 real 0m9.640s 00:06:00.634 user 0m2.248s 00:06:00.634 sys 0m4.412s 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:00.634 21:31:01 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:00.634 ************************************ 00:06:00.634 END TEST dm_mount 00:06:00.634 ************************************ 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:00.634 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:00.634 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:00.634 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:00.634 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:00.634 21:31:02 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:00.634 21:31:02 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:00.635 21:31:02 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:00.635 21:31:02 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:00.635 00:06:00.635 real 0m25.905s 00:06:00.635 user 0m7.025s 00:06:00.635 sys 0m13.598s 00:06:00.635 21:31:02 setup.sh.devices -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:00.635 21:31:02 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:00.635 ************************************ 00:06:00.635 END TEST devices 00:06:00.635 ************************************ 00:06:00.635 00:06:00.635 real 1m26.679s 00:06:00.635 user 0m25.871s 00:06:00.635 sys 0m49.072s 00:06:00.635 21:31:02 setup.sh -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:00.635 21:31:02 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:00.635 ************************************ 00:06:00.635 END TEST setup.sh 00:06:00.635 ************************************ 00:06:00.893 21:31:02 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:04.177 Hugepages 00:06:04.177 node hugesize free / total 00:06:04.177 node0 1048576kB 0 / 0 00:06:04.177 node0 2048kB 1024 / 1024 00:06:04.177 node1 1048576kB 0 / 0 00:06:04.177 node1 2048kB 1024 / 1024 00:06:04.177 00:06:04.177 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:04.177 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:04.177 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:04.177 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:04.177 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:04.177 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:04.177 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:04.177 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:04.177 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:04.177 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:04.177 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:04.177 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:04.177 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:04.178 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:04.178 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:04.178 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:04.178 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:04.178 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:06:04.178 21:31:05 -- spdk/autotest.sh@117 -- # uname -s 00:06:04.178 21:31:05 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:06:04.178 21:31:05 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:06:04.178 21:31:05 -- common/autotest_common.sh@1512 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:07.602 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:06:07.602 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:07.602 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:08.980 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:09.238 21:31:10 -- common/autotest_common.sh@1513 -- # sleep 1 00:06:10.173 21:31:11 -- common/autotest_common.sh@1514 -- # bdfs=() 00:06:10.173 21:31:11 -- common/autotest_common.sh@1514 -- # local bdfs 00:06:10.173 21:31:11 -- common/autotest_common.sh@1516 -- # bdfs=($(get_nvme_bdfs)) 00:06:10.173 21:31:11 -- common/autotest_common.sh@1516 -- # get_nvme_bdfs 00:06:10.173 21:31:11 -- common/autotest_common.sh@1494 -- # bdfs=() 00:06:10.173 21:31:11 -- common/autotest_common.sh@1494 -- # local bdfs 00:06:10.173 21:31:11 -- common/autotest_common.sh@1495 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:10.173 21:31:11 -- common/autotest_common.sh@1495 -- # jq -r '.config[].params.traddr' 00:06:10.173 21:31:11 -- common/autotest_common.sh@1495 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:10.432 21:31:11 -- common/autotest_common.sh@1496 -- # (( 1 == 0 )) 00:06:10.432 21:31:11 -- common/autotest_common.sh@1500 -- # printf '%s\n' 0000:d8:00.0 00:06:10.432 21:31:11 -- common/autotest_common.sh@1518 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:13.719 Waiting for block devices as requested 00:06:13.719 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:13.719 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:13.719 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:13.719 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:13.719 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:13.719 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:13.719 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:13.979 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:13.979 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:13.979 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:14.238 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:14.238 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:14.238 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:14.497 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:14.497 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:14.497 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:14.756 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:06:14.756 21:31:16 -- common/autotest_common.sh@1520 -- # for bdf in "${bdfs[@]}" 00:06:14.756 21:31:16 -- common/autotest_common.sh@1521 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:06:14.756 21:31:16 -- common/autotest_common.sh@1483 -- # readlink -f /sys/class/nvme/nvme0 00:06:14.757 21:31:16 -- common/autotest_common.sh@1483 -- # grep 0000:d8:00.0/nvme/nvme 00:06:14.757 21:31:16 -- common/autotest_common.sh@1483 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:14.757 21:31:16 -- common/autotest_common.sh@1484 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:06:14.757 21:31:16 -- common/autotest_common.sh@1488 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:14.757 21:31:16 -- common/autotest_common.sh@1488 -- # printf '%s\n' nvme0 00:06:14.757 21:31:16 -- common/autotest_common.sh@1521 -- # nvme_ctrlr=/dev/nvme0 00:06:14.757 21:31:16 -- common/autotest_common.sh@1522 -- # [[ -z /dev/nvme0 ]] 00:06:14.757 21:31:16 -- common/autotest_common.sh@1527 -- # nvme id-ctrl /dev/nvme0 00:06:14.757 21:31:16 -- common/autotest_common.sh@1527 -- # grep oacs 00:06:14.757 21:31:16 -- common/autotest_common.sh@1527 -- # cut -d: -f2 00:06:15.017 21:31:16 -- common/autotest_common.sh@1527 -- # oacs=' 0xe' 00:06:15.017 21:31:16 -- common/autotest_common.sh@1528 -- # oacs_ns_manage=8 00:06:15.017 21:31:16 -- common/autotest_common.sh@1530 -- # [[ 8 -ne 0 ]] 00:06:15.017 21:31:16 -- common/autotest_common.sh@1536 -- # nvme id-ctrl /dev/nvme0 00:06:15.017 21:31:16 -- common/autotest_common.sh@1536 -- # cut -d: -f2 00:06:15.017 21:31:16 -- common/autotest_common.sh@1536 -- # grep unvmcap 00:06:15.017 21:31:16 -- common/autotest_common.sh@1536 -- # unvmcap=' 0' 00:06:15.017 21:31:16 -- common/autotest_common.sh@1537 -- # [[ 0 -eq 0 ]] 00:06:15.017 21:31:16 -- common/autotest_common.sh@1539 -- # continue 00:06:15.017 21:31:16 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:06:15.017 21:31:16 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:15.017 21:31:16 -- common/autotest_common.sh@10 -- # set +x 00:06:15.017 21:31:16 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:06:15.017 21:31:16 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:15.017 21:31:16 -- common/autotest_common.sh@10 -- # set +x 00:06:15.017 21:31:16 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:18.303 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:18.303 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:18.303 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:18.303 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:18.303 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:18.303 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:18.303 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:18.303 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:18.303 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:18.304 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:18.304 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:18.304 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:18.304 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:18.304 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:18.304 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:18.562 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:19.942 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:19.942 21:31:21 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:06:19.942 21:31:21 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:19.942 21:31:21 -- common/autotest_common.sh@10 -- # set +x 00:06:20.201 21:31:21 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:06:20.201 21:31:21 -- common/autotest_common.sh@1574 -- # mapfile -t bdfs 00:06:20.201 21:31:21 -- common/autotest_common.sh@1574 -- # get_nvme_bdfs_by_id 0x0a54 00:06:20.201 21:31:21 -- common/autotest_common.sh@1559 -- # bdfs=() 00:06:20.201 21:31:21 -- common/autotest_common.sh@1559 -- # _bdfs=() 00:06:20.201 21:31:21 -- common/autotest_common.sh@1559 -- # local bdfs _bdfs 00:06:20.201 21:31:21 -- common/autotest_common.sh@1560 -- # _bdfs=($(get_nvme_bdfs)) 00:06:20.201 21:31:21 -- 
common/autotest_common.sh@1560 -- # get_nvme_bdfs 00:06:20.201 21:31:21 -- common/autotest_common.sh@1494 -- # bdfs=() 00:06:20.201 21:31:21 -- common/autotest_common.sh@1494 -- # local bdfs 00:06:20.201 21:31:21 -- common/autotest_common.sh@1495 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:20.201 21:31:21 -- common/autotest_common.sh@1495 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:20.201 21:31:21 -- common/autotest_common.sh@1495 -- # jq -r '.config[].params.traddr' 00:06:20.201 21:31:21 -- common/autotest_common.sh@1496 -- # (( 1 == 0 )) 00:06:20.201 21:31:21 -- common/autotest_common.sh@1500 -- # printf '%s\n' 0000:d8:00.0 00:06:20.201 21:31:21 -- common/autotest_common.sh@1561 -- # for bdf in "${_bdfs[@]}" 00:06:20.201 21:31:21 -- common/autotest_common.sh@1562 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:06:20.201 21:31:21 -- common/autotest_common.sh@1562 -- # device=0x0a54 00:06:20.201 21:31:21 -- common/autotest_common.sh@1563 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:20.201 21:31:21 -- common/autotest_common.sh@1564 -- # bdfs+=($bdf) 00:06:20.201 21:31:21 -- common/autotest_common.sh@1568 -- # (( 1 > 0 )) 00:06:20.201 21:31:21 -- common/autotest_common.sh@1569 -- # printf '%s\n' 0000:d8:00.0 00:06:20.201 21:31:21 -- common/autotest_common.sh@1575 -- # [[ -z 0000:d8:00.0 ]] 00:06:20.201 21:31:21 -- common/autotest_common.sh@1580 -- # spdk_tgt_pid=3303324 00:06:20.201 21:31:21 -- common/autotest_common.sh@1581 -- # waitforlisten 3303324 00:06:20.201 21:31:21 -- common/autotest_common.sh@1579 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:20.201 21:31:21 -- common/autotest_common.sh@831 -- # '[' -z 3303324 ']' 00:06:20.201 21:31:21 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:20.201 21:31:21 -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:20.201 21:31:21 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:20.201 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:20.201 21:31:21 -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:20.201 21:31:21 -- common/autotest_common.sh@10 -- # set +x 00:06:20.201 [2024-10-27 21:31:21.837747] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:06:20.201 [2024-10-27 21:31:21.837815] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3303324 ] 00:06:20.459 [2024-10-27 21:31:21.972165] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
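Note: the BDF discovery xtraced above reduces to a small helper. A minimal sketch, assuming the stock autotest layout (scripts/gen_nvme.sh emitting a bdev_nvme_attach_controller config, jq on PATH); the helper name and structure here are illustrative, not the verbatim autotest_common.sh source:

  #!/usr/bin/env bash
  rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

  # Ask gen_nvme.sh for the attach config and pull each controller's PCI address
  get_nvme_bdfs() {
      local bdfs
      bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
      ((${#bdfs[@]} > 0)) || return 1
      printf '%s\n' "${bdfs[@]}"
  }

  # Keep only controllers whose PCI device ID matches the 8086:0a54 NVMe from
  # the status table earlier; sysfs exposes the ID as the string "0x0a54"
  for bdf in $(get_nvme_bdfs); do
      [[ $(cat "/sys/bus/pci/devices/$bdf/device") == 0x0a54 ]] && echo "$bdf"
  done

On this rig that loop prints exactly one address, 0000:d8:00.0, matching the printf seen in the trace.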
00:06:20.459 [2024-10-27 21:31:22.006746] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.459 [2024-10-27 21:31:22.028522] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.025 21:31:22 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:21.025 21:31:22 -- common/autotest_common.sh@864 -- # return 0 00:06:21.025 21:31:22 -- common/autotest_common.sh@1583 -- # bdf_id=0 00:06:21.025 21:31:22 -- common/autotest_common.sh@1584 -- # for bdf in "${bdfs[@]}" 00:06:21.025 21:31:22 -- common/autotest_common.sh@1585 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:06:24.312 nvme0n1 00:06:24.312 21:31:25 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:24.312 [2024-10-27 21:31:25.858698] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:24.312 request: 00:06:24.312 { 00:06:24.312 "nvme_ctrlr_name": "nvme0", 00:06:24.312 "password": "test", 00:06:24.312 "method": "bdev_nvme_opal_revert", 00:06:24.312 "req_id": 1 00:06:24.312 } 00:06:24.312 Got JSON-RPC error response 00:06:24.312 response: 00:06:24.312 { 00:06:24.312 "code": -32602, 00:06:24.312 "message": "Invalid parameters" 00:06:24.312 } 00:06:24.312 21:31:25 -- common/autotest_common.sh@1587 -- # true 00:06:24.312 21:31:25 -- common/autotest_common.sh@1588 -- # (( ++bdf_id )) 00:06:24.312 21:31:25 -- common/autotest_common.sh@1591 -- # killprocess 3303324 00:06:24.312 21:31:25 -- common/autotest_common.sh@950 -- # '[' -z 3303324 ']' 00:06:24.312 21:31:25 -- common/autotest_common.sh@954 -- # kill -0 3303324 00:06:24.312 21:31:25 -- common/autotest_common.sh@955 -- # uname 00:06:24.312 21:31:25 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:24.312 21:31:25 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3303324 00:06:24.312 21:31:25 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:24.312 21:31:25 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:24.312 21:31:25 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3303324' 00:06:24.312 killing process with pid 3303324 00:06:24.312 21:31:25 -- common/autotest_common.sh@969 -- # kill 3303324 00:06:24.312 21:31:25 -- common/autotest_common.sh@974 -- # wait 3303324 00:06:26.845 21:31:28 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:06:26.845 21:31:28 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:06:26.845 21:31:28 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:26.845 21:31:28 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:26.845 21:31:28 -- spdk/autotest.sh@149 -- # timing_enter lib 00:06:26.845 21:31:28 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:26.845 21:31:28 -- common/autotest_common.sh@10 -- # set +x 00:06:26.845 21:31:28 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:06:26.845 21:31:28 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:26.845 21:31:28 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:26.845 21:31:28 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:26.845 21:31:28 -- common/autotest_common.sh@10 -- # set +x 00:06:26.845 ************************************ 00:06:26.845 START TEST env 00:06:26.845 ************************************ 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1125 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:26.845 * Looking for test storage... 00:06:26.845 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1689 -- # lcov --version 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:06:26.845 21:31:28 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:26.845 21:31:28 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:26.845 21:31:28 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:26.845 21:31:28 env -- scripts/common.sh@336 -- # IFS=.-: 00:06:26.845 21:31:28 env -- scripts/common.sh@336 -- # read -ra ver1 00:06:26.845 21:31:28 env -- scripts/common.sh@337 -- # IFS=.-: 00:06:26.845 21:31:28 env -- scripts/common.sh@337 -- # read -ra ver2 00:06:26.845 21:31:28 env -- scripts/common.sh@338 -- # local 'op=<' 00:06:26.845 21:31:28 env -- scripts/common.sh@340 -- # ver1_l=2 00:06:26.845 21:31:28 env -- scripts/common.sh@341 -- # ver2_l=1 00:06:26.845 21:31:28 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:26.845 21:31:28 env -- scripts/common.sh@344 -- # case "$op" in 00:06:26.845 21:31:28 env -- scripts/common.sh@345 -- # : 1 00:06:26.845 21:31:28 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:26.845 21:31:28 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:26.845 21:31:28 env -- scripts/common.sh@365 -- # decimal 1 00:06:26.845 21:31:28 env -- scripts/common.sh@353 -- # local d=1 00:06:26.845 21:31:28 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:26.845 21:31:28 env -- scripts/common.sh@355 -- # echo 1 00:06:26.845 21:31:28 env -- scripts/common.sh@365 -- # ver1[v]=1 00:06:26.845 21:31:28 env -- scripts/common.sh@366 -- # decimal 2 00:06:26.845 21:31:28 env -- scripts/common.sh@353 -- # local d=2 00:06:26.845 21:31:28 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:26.845 21:31:28 env -- scripts/common.sh@355 -- # echo 2 00:06:26.845 21:31:28 env -- scripts/common.sh@366 -- # ver2[v]=2 00:06:26.845 21:31:28 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:26.845 21:31:28 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:26.845 21:31:28 env -- scripts/common.sh@368 -- # return 0 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:06:26.845 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.845 --rc genhtml_branch_coverage=1 00:06:26.845 --rc genhtml_function_coverage=1 00:06:26.845 --rc genhtml_legend=1 00:06:26.845 --rc geninfo_all_blocks=1 00:06:26.845 --rc geninfo_unexecuted_blocks=1 00:06:26.845 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.845 ' 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:06:26.845 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.845 --rc genhtml_branch_coverage=1 00:06:26.845 --rc genhtml_function_coverage=1 00:06:26.845 --rc genhtml_legend=1 00:06:26.845 --rc geninfo_all_blocks=1 00:06:26.845 --rc geninfo_unexecuted_blocks=1 00:06:26.845 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.845 ' 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:06:26.845 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.845 --rc genhtml_branch_coverage=1 00:06:26.845 --rc genhtml_function_coverage=1 00:06:26.845 --rc genhtml_legend=1 00:06:26.845 --rc geninfo_all_blocks=1 00:06:26.845 --rc geninfo_unexecuted_blocks=1 00:06:26.845 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.845 ' 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:06:26.845 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.845 --rc genhtml_branch_coverage=1 00:06:26.845 --rc genhtml_function_coverage=1 00:06:26.845 --rc genhtml_legend=1 00:06:26.845 --rc geninfo_all_blocks=1 00:06:26.845 --rc geninfo_unexecuted_blocks=1 00:06:26.845 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.845 ' 00:06:26.845 21:31:28 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:26.845 21:31:28 env -- common/autotest_common.sh@10 -- # set +x 00:06:26.845 ************************************ 00:06:26.845 START TEST env_memory 00:06:26.845 ************************************ 00:06:26.845 21:31:28 env.env_memory -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:26.845 00:06:26.845 00:06:26.845 CUnit - A unit testing framework for C - Version 2.1-3 00:06:26.845 http://cunit.sourceforge.net/ 00:06:26.845 00:06:26.845 00:06:26.845 Suite: memory 00:06:26.845 Test: alloc and free memory map ...[2024-10-27 21:31:28.399890] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:26.845 passed 00:06:26.845 Test: mem map translation ...[2024-10-27 21:31:28.416836] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:26.845 [2024-10-27 21:31:28.416854] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:26.845 [2024-10-27 21:31:28.416909] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:26.845 [2024-10-27 21:31:28.416919] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:26.845 passed 00:06:26.845 Test: mem map registration ...[2024-10-27 21:31:28.437798] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:06:26.845 [2024-10-27 21:31:28.437814] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:06:26.845 passed 00:06:26.845 Test: mem 
map adjacent registrations ...passed 00:06:26.845 00:06:26.845 Run Summary: Type Total Ran Passed Failed Inactive 00:06:26.845 suites 1 1 n/a 0 0 00:06:26.845 tests 4 4 4 0 0 00:06:26.845 asserts 152 152 152 0 n/a 00:06:26.845 00:06:26.845 Elapsed time = 0.087 seconds 00:06:26.845 00:06:26.845 real 0m0.099s 00:06:26.845 user 0m0.087s 00:06:26.845 sys 0m0.011s 00:06:26.845 21:31:28 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:26.845 21:31:28 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:26.845 ************************************ 00:06:26.845 END TEST env_memory 00:06:26.845 ************************************ 00:06:26.845 21:31:28 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:26.845 21:31:28 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:26.845 21:31:28 env -- common/autotest_common.sh@10 -- # set +x 00:06:26.845 ************************************ 00:06:26.845 START TEST env_vtophys 00:06:26.845 ************************************ 00:06:26.845 21:31:28 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:26.845 EAL: lib.eal log level changed from notice to debug 00:06:26.845 EAL: Detected lcore 0 as core 0 on socket 0 00:06:26.845 EAL: Detected lcore 1 as core 1 on socket 0 00:06:26.845 EAL: Detected lcore 2 as core 2 on socket 0 00:06:26.845 EAL: Detected lcore 3 as core 3 on socket 0 00:06:26.845 EAL: Detected lcore 4 as core 4 on socket 0 00:06:26.845 EAL: Detected lcore 5 as core 5 on socket 0 00:06:26.845 EAL: Detected lcore 6 as core 6 on socket 0 00:06:26.845 EAL: Detected lcore 7 as core 8 on socket 0 00:06:26.845 EAL: Detected lcore 8 as core 9 on socket 0 00:06:26.846 EAL: Detected lcore 9 as core 10 on socket 0 00:06:26.846 EAL: Detected lcore 10 as core 11 on socket 0 00:06:26.846 EAL: Detected lcore 11 as core 12 on socket 0 00:06:26.846 EAL: Detected lcore 12 as core 13 on socket 0 00:06:26.846 EAL: Detected lcore 13 as core 14 on socket 0 00:06:26.846 EAL: Detected lcore 14 as core 16 on socket 0 00:06:26.846 EAL: Detected lcore 15 as core 17 on socket 0 00:06:26.846 EAL: Detected lcore 16 as core 18 on socket 0 00:06:26.846 EAL: Detected lcore 17 as core 19 on socket 0 00:06:26.846 EAL: Detected lcore 18 as core 20 on socket 0 00:06:26.846 EAL: Detected lcore 19 as core 21 on socket 0 00:06:26.846 EAL: Detected lcore 20 as core 22 on socket 0 00:06:26.846 EAL: Detected lcore 21 as core 24 on socket 0 00:06:26.846 EAL: Detected lcore 22 as core 25 on socket 0 00:06:26.846 EAL: Detected lcore 23 as core 26 on socket 0 00:06:26.846 EAL: Detected lcore 24 as core 27 on socket 0 00:06:26.846 EAL: Detected lcore 25 as core 28 on socket 0 00:06:26.846 EAL: Detected lcore 26 as core 29 on socket 0 00:06:26.846 EAL: Detected lcore 27 as core 30 on socket 0 00:06:26.846 EAL: Detected lcore 28 as core 0 on socket 1 00:06:26.846 EAL: Detected lcore 29 as core 1 on socket 1 00:06:26.846 EAL: Detected lcore 30 as core 2 on socket 1 00:06:26.846 EAL: Detected lcore 31 as core 3 on socket 1 00:06:26.846 EAL: Detected lcore 32 as core 4 on socket 1 00:06:26.846 EAL: Detected lcore 33 as core 5 on socket 1 00:06:26.846 EAL: Detected lcore 34 as core 6 on socket 1 00:06:26.846 EAL: Detected lcore 35 as core 8 on socket 1 00:06:26.846 EAL: Detected lcore 36 as core 9 on socket 1 
00:06:26.846 EAL: Detected lcore 37 as core 10 on socket 1 00:06:26.846 EAL: Detected lcore 38 as core 11 on socket 1 00:06:26.846 EAL: Detected lcore 39 as core 12 on socket 1 00:06:26.846 EAL: Detected lcore 40 as core 13 on socket 1 00:06:26.846 EAL: Detected lcore 41 as core 14 on socket 1 00:06:26.846 EAL: Detected lcore 42 as core 16 on socket 1 00:06:26.846 EAL: Detected lcore 43 as core 17 on socket 1 00:06:26.846 EAL: Detected lcore 44 as core 18 on socket 1 00:06:26.846 EAL: Detected lcore 45 as core 19 on socket 1 00:06:26.846 EAL: Detected lcore 46 as core 20 on socket 1 00:06:26.846 EAL: Detected lcore 47 as core 21 on socket 1 00:06:26.846 EAL: Detected lcore 48 as core 22 on socket 1 00:06:26.846 EAL: Detected lcore 49 as core 24 on socket 1 00:06:26.846 EAL: Detected lcore 50 as core 25 on socket 1 00:06:26.846 EAL: Detected lcore 51 as core 26 on socket 1 00:06:26.846 EAL: Detected lcore 52 as core 27 on socket 1 00:06:26.846 EAL: Detected lcore 53 as core 28 on socket 1 00:06:26.846 EAL: Detected lcore 54 as core 29 on socket 1 00:06:26.846 EAL: Detected lcore 55 as core 30 on socket 1 00:06:26.846 EAL: Detected lcore 56 as core 0 on socket 0 00:06:26.846 EAL: Detected lcore 57 as core 1 on socket 0 00:06:26.846 EAL: Detected lcore 58 as core 2 on socket 0 00:06:26.846 EAL: Detected lcore 59 as core 3 on socket 0 00:06:26.846 EAL: Detected lcore 60 as core 4 on socket 0 00:06:26.846 EAL: Detected lcore 61 as core 5 on socket 0 00:06:26.846 EAL: Detected lcore 62 as core 6 on socket 0 00:06:26.846 EAL: Detected lcore 63 as core 8 on socket 0 00:06:26.846 EAL: Detected lcore 64 as core 9 on socket 0 00:06:26.846 EAL: Detected lcore 65 as core 10 on socket 0 00:06:26.846 EAL: Detected lcore 66 as core 11 on socket 0 00:06:26.846 EAL: Detected lcore 67 as core 12 on socket 0 00:06:26.846 EAL: Detected lcore 68 as core 13 on socket 0 00:06:26.846 EAL: Detected lcore 69 as core 14 on socket 0 00:06:26.846 EAL: Detected lcore 70 as core 16 on socket 0 00:06:26.846 EAL: Detected lcore 71 as core 17 on socket 0 00:06:26.846 EAL: Detected lcore 72 as core 18 on socket 0 00:06:26.846 EAL: Detected lcore 73 as core 19 on socket 0 00:06:26.846 EAL: Detected lcore 74 as core 20 on socket 0 00:06:26.846 EAL: Detected lcore 75 as core 21 on socket 0 00:06:26.846 EAL: Detected lcore 76 as core 22 on socket 0 00:06:26.846 EAL: Detected lcore 77 as core 24 on socket 0 00:06:26.846 EAL: Detected lcore 78 as core 25 on socket 0 00:06:26.846 EAL: Detected lcore 79 as core 26 on socket 0 00:06:26.846 EAL: Detected lcore 80 as core 27 on socket 0 00:06:26.846 EAL: Detected lcore 81 as core 28 on socket 0 00:06:26.846 EAL: Detected lcore 82 as core 29 on socket 0 00:06:26.846 EAL: Detected lcore 83 as core 30 on socket 0 00:06:26.846 EAL: Detected lcore 84 as core 0 on socket 1 00:06:26.846 EAL: Detected lcore 85 as core 1 on socket 1 00:06:26.846 EAL: Detected lcore 86 as core 2 on socket 1 00:06:26.846 EAL: Detected lcore 87 as core 3 on socket 1 00:06:26.846 EAL: Detected lcore 88 as core 4 on socket 1 00:06:26.846 EAL: Detected lcore 89 as core 5 on socket 1 00:06:26.846 EAL: Detected lcore 90 as core 6 on socket 1 00:06:26.846 EAL: Detected lcore 91 as core 8 on socket 1 00:06:26.846 EAL: Detected lcore 92 as core 9 on socket 1 00:06:26.846 EAL: Detected lcore 93 as core 10 on socket 1 00:06:26.846 EAL: Detected lcore 94 as core 11 on socket 1 00:06:26.846 EAL: Detected lcore 95 as core 12 on socket 1 00:06:26.846 EAL: Detected lcore 96 as core 13 on socket 1 00:06:26.846 EAL: Detected lcore 
97 as core 14 on socket 1 00:06:26.846 EAL: Detected lcore 98 as core 16 on socket 1 00:06:26.846 EAL: Detected lcore 99 as core 17 on socket 1 00:06:26.846 EAL: Detected lcore 100 as core 18 on socket 1 00:06:26.846 EAL: Detected lcore 101 as core 19 on socket 1 00:06:26.846 EAL: Detected lcore 102 as core 20 on socket 1 00:06:26.846 EAL: Detected lcore 103 as core 21 on socket 1 00:06:26.846 EAL: Detected lcore 104 as core 22 on socket 1 00:06:26.846 EAL: Detected lcore 105 as core 24 on socket 1 00:06:26.846 EAL: Detected lcore 106 as core 25 on socket 1 00:06:26.846 EAL: Detected lcore 107 as core 26 on socket 1 00:06:26.846 EAL: Detected lcore 108 as core 27 on socket 1 00:06:26.846 EAL: Detected lcore 109 as core 28 on socket 1 00:06:26.846 EAL: Detected lcore 110 as core 29 on socket 1 00:06:26.846 EAL: Detected lcore 111 as core 30 on socket 1 00:06:27.106 EAL: Maximum logical cores by configuration: 128 00:06:27.106 EAL: Detected CPU lcores: 112 00:06:27.106 EAL: Detected NUMA nodes: 2 00:06:27.106 EAL: Checking presence of .so 'librte_eal.so.25.0' 00:06:27.106 EAL: Checking presence of .so 'librte_eal.so.25' 00:06:27.106 EAL: Checking presence of .so 'librte_eal.so' 00:06:27.106 EAL: Detected static linkage of DPDK 00:06:27.106 EAL: No shared files mode enabled, IPC will be disabled 00:06:27.106 EAL: Bus pci wants IOVA as 'DC' 00:06:27.106 EAL: Buses did not request a specific IOVA mode. 00:06:27.106 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:27.106 EAL: Selected IOVA mode 'VA' 00:06:27.106 EAL: Probing VFIO support... 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: IOMMU type 1 (Type 1) is supported 00:06:27.106 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:27.106 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:27.106 EAL: VFIO support initialized 00:06:27.106 EAL: Ask a virtual area of 0x2e000 bytes 00:06:27.106 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:27.106 EAL: Setting up physically contiguous memory... 
00:06:27.106 EAL: Setting maximum number of open files to 524288 00:06:27.106 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:27.106 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:27.106 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:27.106 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.106 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:27.106 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:27.106 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.106 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:27.106 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:27.106 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.106 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:27.106 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:27.106 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.106 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:27.106 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:27.106 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.106 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:27.106 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:27.106 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.106 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:27.106 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:27.106 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.106 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:27.106 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:27.106 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.106 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:27.106 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:27.106 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:27.106 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.106 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:27.106 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:27.106 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.106 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:27.106 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:27.106 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.106 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:27.106 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:27.106 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.106 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:27.106 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:27.106 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.106 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:27.106 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:27.106 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.106 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:27.106 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:27.106 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.106 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:27.106 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:27.106 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.106 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:06:27.106 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:27.106 EAL: Hugepages will be freed exactly as allocated. 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Refined arch frequency 2500000000 to measured frequency 2494140772 00:06:27.106 EAL: TSC frequency is ~2494100 KHz 00:06:27.106 EAL: Main lcore 0 is ready (tid=7feca8e0ba00;cpuset=[0]) 00:06:27.106 EAL: Trying to obtain current memory policy. 00:06:27.106 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.106 EAL: Restoring previous memory policy: 0 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was expanded by 2MB 00:06:27.106 EAL: Mem event callback 'spdk:(nil)' registered 00:06:27.106 00:06:27.106 00:06:27.106 CUnit - A unit testing framework for C - Version 2.1-3 00:06:27.106 http://cunit.sourceforge.net/ 00:06:27.106 00:06:27.106 00:06:27.106 Suite: components_suite 00:06:27.106 Test: vtophys_malloc_test ...passed 00:06:27.106 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:27.106 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.106 EAL: Restoring previous memory policy: 4 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was expanded by 4MB 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was shrunk by 4MB 00:06:27.106 EAL: Trying to obtain current memory policy. 00:06:27.106 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.106 EAL: Restoring previous memory policy: 4 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was expanded by 6MB 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was shrunk by 6MB 00:06:27.106 EAL: Trying to obtain current memory policy. 00:06:27.106 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.106 EAL: Restoring previous memory policy: 4 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was expanded by 10MB 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was shrunk by 10MB 00:06:27.106 EAL: Trying to obtain current memory policy. 
00:06:27.106 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.106 EAL: Restoring previous memory policy: 4 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was expanded by 18MB 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was shrunk by 18MB 00:06:27.106 EAL: Trying to obtain current memory policy. 00:06:27.106 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.106 EAL: Restoring previous memory policy: 4 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was expanded by 34MB 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was shrunk by 34MB 00:06:27.106 EAL: Trying to obtain current memory policy. 00:06:27.106 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.106 EAL: Restoring previous memory policy: 4 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was expanded by 66MB 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was shrunk by 66MB 00:06:27.106 EAL: Trying to obtain current memory policy. 00:06:27.106 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.106 EAL: Restoring previous memory policy: 4 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.106 EAL: request: mp_malloc_sync 00:06:27.106 EAL: No shared files mode enabled, IPC is disabled 00:06:27.106 EAL: Heap on socket 0 was expanded by 130MB 00:06:27.106 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.365 EAL: request: mp_malloc_sync 00:06:27.365 EAL: No shared files mode enabled, IPC is disabled 00:06:27.365 EAL: Heap on socket 0 was shrunk by 130MB 00:06:27.365 EAL: Trying to obtain current memory policy. 00:06:27.365 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.365 EAL: Restoring previous memory policy: 4 00:06:27.365 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.365 EAL: request: mp_malloc_sync 00:06:27.365 EAL: No shared files mode enabled, IPC is disabled 00:06:27.365 EAL: Heap on socket 0 was expanded by 258MB 00:06:27.365 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.365 EAL: request: mp_malloc_sync 00:06:27.365 EAL: No shared files mode enabled, IPC is disabled 00:06:27.365 EAL: Heap on socket 0 was shrunk by 258MB 00:06:27.365 EAL: Trying to obtain current memory policy. 
00:06:27.365 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.365 EAL: Restoring previous memory policy: 4 00:06:27.365 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.365 EAL: request: mp_malloc_sync 00:06:27.365 EAL: No shared files mode enabled, IPC is disabled 00:06:27.365 EAL: Heap on socket 0 was expanded by 514MB 00:06:27.624 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.624 EAL: request: mp_malloc_sync 00:06:27.624 EAL: No shared files mode enabled, IPC is disabled 00:06:27.624 EAL: Heap on socket 0 was shrunk by 514MB 00:06:27.624 EAL: Trying to obtain current memory policy. 00:06:27.624 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.883 EAL: Restoring previous memory policy: 4 00:06:27.883 EAL: Calling mem event callback 'spdk:(nil)' 00:06:27.883 EAL: request: mp_malloc_sync 00:06:27.883 EAL: No shared files mode enabled, IPC is disabled 00:06:27.883 EAL: Heap on socket 0 was expanded by 1026MB 00:06:27.883 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.141 EAL: request: mp_malloc_sync 00:06:28.141 EAL: No shared files mode enabled, IPC is disabled 00:06:28.141 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:28.141 passed 00:06:28.141 00:06:28.141 Run Summary: Type Total Ran Passed Failed Inactive 00:06:28.141 suites 1 1 n/a 0 0 00:06:28.141 tests 2 2 2 0 0 00:06:28.141 asserts 497 497 497 0 n/a 00:06:28.141 00:06:28.141 Elapsed time = 0.967 seconds 00:06:28.141 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.141 EAL: request: mp_malloc_sync 00:06:28.141 EAL: No shared files mode enabled, IPC is disabled 00:06:28.141 EAL: Heap on socket 0 was shrunk by 2MB 00:06:28.141 EAL: No shared files mode enabled, IPC is disabled 00:06:28.141 EAL: No shared files mode enabled, IPC is disabled 00:06:28.141 EAL: No shared files mode enabled, IPC is disabled 00:06:28.141 00:06:28.141 real 0m1.194s 00:06:28.141 user 0m0.641s 00:06:28.141 sys 0m0.421s 00:06:28.141 21:31:29 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:28.141 21:31:29 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:28.141 ************************************ 00:06:28.141 END TEST env_vtophys 00:06:28.141 ************************************ 00:06:28.141 21:31:29 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:28.141 21:31:29 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:28.141 21:31:29 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:28.141 21:31:29 env -- common/autotest_common.sh@10 -- # set +x 00:06:28.141 ************************************ 00:06:28.141 START TEST env_pci 00:06:28.141 ************************************ 00:06:28.141 21:31:29 env.env_pci -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:28.141 00:06:28.141 00:06:28.141 CUnit - A unit testing framework for C - Version 2.1-3 00:06:28.141 http://cunit.sourceforge.net/ 00:06:28.141 00:06:28.141 00:06:28.141 Suite: pci 00:06:28.141 Test: pci_hook ...[2024-10-27 21:31:29.820874] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1050:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 3304859 has claimed it 00:06:28.141 EAL: Cannot find device (10000:00:01.0) 00:06:28.141 EAL: Failed to attach device on primary process 00:06:28.141 passed 00:06:28.141 00:06:28.141 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:28.141 suites 1 1 n/a 0 0 00:06:28.141 tests 1 1 1 0 0 00:06:28.141 asserts 25 25 25 0 n/a 00:06:28.141 00:06:28.141 Elapsed time = 0.034 seconds 00:06:28.141 00:06:28.141 real 0m0.052s 00:06:28.141 user 0m0.010s 00:06:28.141 sys 0m0.042s 00:06:28.141 21:31:29 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:28.141 21:31:29 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:28.141 ************************************ 00:06:28.141 END TEST env_pci 00:06:28.141 ************************************ 00:06:28.398 21:31:29 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:28.398 21:31:29 env -- env/env.sh@15 -- # uname 00:06:28.398 21:31:29 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:28.398 21:31:29 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:28.398 21:31:29 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:28.398 21:31:29 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:06:28.398 21:31:29 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:28.398 21:31:29 env -- common/autotest_common.sh@10 -- # set +x 00:06:28.398 ************************************ 00:06:28.398 START TEST env_dpdk_post_init 00:06:28.398 ************************************ 00:06:28.398 21:31:29 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:28.398 EAL: Detected CPU lcores: 112 00:06:28.398 EAL: Detected NUMA nodes: 2 00:06:28.398 EAL: Detected static linkage of DPDK 00:06:28.398 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:28.398 EAL: Selected IOVA mode 'VA' 00:06:28.398 EAL: VFIO support initialized 00:06:28.655 EAL: Using IOMMU type 1 (Type 1) 00:06:33.921 Starting DPDK initialization... 00:06:33.921 Starting SPDK post initialization... 00:06:33.921 SPDK NVMe probe 00:06:33.921 Attaching to 0000:d8:00.0 00:06:33.921 Attached to 0000:d8:00.0 00:06:33.921 Cleaning up... 
00:06:33.921 00:06:33.921 real 0m4.850s 00:06:33.921 user 0m3.353s 00:06:33.921 sys 0m0.646s 00:06:33.921 21:31:34 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:33.921 21:31:34 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:33.921 ************************************ 00:06:33.921 END TEST env_dpdk_post_init 00:06:33.921 ************************************ 00:06:33.921 21:31:34 env -- env/env.sh@26 -- # uname 00:06:33.921 21:31:34 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:33.921 21:31:34 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:33.921 21:31:34 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:33.921 21:31:34 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:33.921 21:31:34 env -- common/autotest_common.sh@10 -- # set +x 00:06:33.921 ************************************ 00:06:33.921 START TEST env_mem_callbacks 00:06:33.921 ************************************ 00:06:33.921 21:31:34 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:33.921 EAL: Detected CPU lcores: 112 00:06:33.921 EAL: Detected NUMA nodes: 2 00:06:33.921 EAL: Detected static linkage of DPDK 00:06:33.921 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:33.921 EAL: Selected IOVA mode 'VA' 00:06:33.921 EAL: VFIO support initialized 00:06:33.921 00:06:33.921 00:06:33.921 CUnit - A unit testing framework for C - Version 2.1-3 00:06:33.921 http://cunit.sourceforge.net/ 00:06:33.921 00:06:33.921 00:06:33.921 Suite: memory 00:06:33.921 Test: test ... 00:06:33.921 register 0x200000200000 2097152 00:06:33.921 malloc 3145728 00:06:33.921 register 0x200000400000 4194304 00:06:33.921 buf 0x200000500000 len 3145728 PASSED 00:06:33.921 malloc 64 00:06:33.921 buf 0x2000004fff40 len 64 PASSED 00:06:33.921 malloc 4194304 00:06:33.921 register 0x200000800000 6291456 00:06:33.921 buf 0x200000a00000 len 4194304 PASSED 00:06:33.921 free 0x200000500000 3145728 00:06:33.921 free 0x2000004fff40 64 00:06:33.921 unregister 0x200000400000 4194304 PASSED 00:06:33.921 free 0x200000a00000 4194304 00:06:33.921 unregister 0x200000800000 6291456 PASSED 00:06:33.921 malloc 8388608 00:06:33.921 register 0x200000400000 10485760 00:06:33.921 buf 0x200000600000 len 8388608 PASSED 00:06:33.921 free 0x200000600000 8388608 00:06:33.921 unregister 0x200000400000 10485760 PASSED 00:06:33.921 passed 00:06:33.921 00:06:33.921 Run Summary: Type Total Ran Passed Failed Inactive 00:06:33.921 suites 1 1 n/a 0 0 00:06:33.921 tests 1 1 1 0 0 00:06:33.921 asserts 15 15 15 0 n/a 00:06:33.921 00:06:33.921 Elapsed time = 0.006 seconds 00:06:33.921 00:06:33.921 real 0m0.165s 00:06:33.921 user 0m0.020s 00:06:33.921 sys 0m0.045s 00:06:33.921 21:31:35 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:33.921 21:31:35 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:33.921 ************************************ 00:06:33.921 END TEST env_mem_callbacks 00:06:33.921 ************************************ 00:06:33.921 00:06:33.921 real 0m6.909s 00:06:33.921 user 0m4.315s 00:06:33.921 sys 0m1.556s 00:06:33.921 21:31:35 env -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:33.921 21:31:35 env -- common/autotest_common.sh@10 -- # set +x 00:06:33.921 ************************************ 00:06:33.921 END TEST env 
00:06:33.922 ************************************ 00:06:33.922 21:31:35 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:33.922 21:31:35 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:33.922 21:31:35 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:33.922 21:31:35 -- common/autotest_common.sh@10 -- # set +x 00:06:33.922 ************************************ 00:06:33.922 START TEST rpc 00:06:33.922 ************************************ 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:33.922 * Looking for test storage... 00:06:33.922 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@1689 -- # lcov --version 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:06:33.922 21:31:35 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:33.922 21:31:35 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:33.922 21:31:35 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:33.922 21:31:35 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:33.922 21:31:35 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:33.922 21:31:35 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:33.922 21:31:35 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:33.922 21:31:35 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:33.922 21:31:35 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:33.922 21:31:35 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:33.922 21:31:35 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:33.922 21:31:35 rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:33.922 21:31:35 rpc -- scripts/common.sh@345 -- # : 1 00:06:33.922 21:31:35 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:33.922 21:31:35 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:33.922 21:31:35 rpc -- scripts/common.sh@365 -- # decimal 1 00:06:33.922 21:31:35 rpc -- scripts/common.sh@353 -- # local d=1 00:06:33.922 21:31:35 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:33.922 21:31:35 rpc -- scripts/common.sh@355 -- # echo 1 00:06:33.922 21:31:35 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:33.922 21:31:35 rpc -- scripts/common.sh@366 -- # decimal 2 00:06:33.922 21:31:35 rpc -- scripts/common.sh@353 -- # local d=2 00:06:33.922 21:31:35 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:33.922 21:31:35 rpc -- scripts/common.sh@355 -- # echo 2 00:06:33.922 21:31:35 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:33.922 21:31:35 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:33.922 21:31:35 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:33.922 21:31:35 rpc -- scripts/common.sh@368 -- # return 0 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:06:33.922 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.922 --rc genhtml_branch_coverage=1 00:06:33.922 --rc genhtml_function_coverage=1 00:06:33.922 --rc genhtml_legend=1 00:06:33.922 --rc geninfo_all_blocks=1 00:06:33.922 --rc geninfo_unexecuted_blocks=1 00:06:33.922 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.922 ' 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:06:33.922 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.922 --rc genhtml_branch_coverage=1 00:06:33.922 --rc genhtml_function_coverage=1 00:06:33.922 --rc genhtml_legend=1 00:06:33.922 --rc geninfo_all_blocks=1 00:06:33.922 --rc geninfo_unexecuted_blocks=1 00:06:33.922 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.922 ' 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:06:33.922 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.922 --rc genhtml_branch_coverage=1 00:06:33.922 --rc genhtml_function_coverage=1 00:06:33.922 --rc genhtml_legend=1 00:06:33.922 --rc geninfo_all_blocks=1 00:06:33.922 --rc geninfo_unexecuted_blocks=1 00:06:33.922 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.922 ' 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:06:33.922 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.922 --rc genhtml_branch_coverage=1 00:06:33.922 --rc genhtml_function_coverage=1 00:06:33.922 --rc genhtml_legend=1 00:06:33.922 --rc geninfo_all_blocks=1 00:06:33.922 --rc geninfo_unexecuted_blocks=1 00:06:33.922 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.922 ' 00:06:33.922 21:31:35 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:33.922 21:31:35 rpc -- rpc/rpc.sh@65 -- # spdk_pid=3305787 00:06:33.922 21:31:35 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:33.922 21:31:35 rpc -- rpc/rpc.sh@67 -- # waitforlisten 3305787 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@831 -- # '[' -z 3305787 ']' 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:33.922 21:31:35 rpc 
-- common/autotest_common.sh@836 -- # local max_retries=100 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:33.922 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:33.922 21:31:35 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:33.922 [2024-10-27 21:31:35.337277] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:06:33.922 [2024-10-27 21:31:35.337326] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3305787 ] 00:06:33.922 [2024-10-27 21:31:35.470432] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:33.922 [2024-10-27 21:31:35.506302] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.922 [2024-10-27 21:31:35.528040] app.c: 612:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:33.922 [2024-10-27 21:31:35.528075] app.c: 616:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 3305787' to capture a snapshot of events at runtime. 00:06:33.922 [2024-10-27 21:31:35.528084] app.c: 618:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:33.922 [2024-10-27 21:31:35.528093] app.c: 619:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:33.922 [2024-10-27 21:31:35.528099] app.c: 620:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid3305787 for offline analysis/debug. 
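Note: the trace notices above are actionable as printed; this spdk_tgt was started with '-e bdev', so the bdev tracepoint group is live. A sketch of both capture paths, using the pid and shm name from this run (both change per invocation):

  cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # Live snapshot, exactly as the notice suggests
  build/bin/spdk_trace -s spdk_tgt -p 3305787
  # Or keep the shared-memory file for later; decoding the copy with
  # 'spdk_trace -f' is an assumption here -- check 'spdk_trace -h' in your tree
  cp /dev/shm/spdk_tgt_trace.pid3305787 /tmp/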
00:06:33.922 [2024-10-27 21:31:35.528714] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.489 21:31:36 rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:34.489 21:31:36 rpc -- common/autotest_common.sh@864 -- # return 0 00:06:34.489 21:31:36 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:34.489 21:31:36 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:34.489 21:31:36 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:34.489 21:31:36 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:34.489 21:31:36 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:34.489 21:31:36 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:34.489 21:31:36 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:34.748 ************************************ 00:06:34.748 START TEST rpc_integrity 00:06:34.748 ************************************ 00:06:34.748 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:34.748 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:34.748 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.748 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.748 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.748 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:34.748 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:34.748 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:34.748 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:34.748 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.748 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.748 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.748 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:34.748 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:34.748 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.748 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.748 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.748 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:34.748 { 00:06:34.748 "name": "Malloc0", 00:06:34.748 "aliases": [ 00:06:34.748 "70d62f6d-8c4b-4ca0-9879-644a39732e03" 00:06:34.748 ], 00:06:34.748 "product_name": "Malloc disk", 00:06:34.748 "block_size": 512, 00:06:34.748 "num_blocks": 16384, 00:06:34.748 "uuid": "70d62f6d-8c4b-4ca0-9879-644a39732e03", 00:06:34.748 "assigned_rate_limits": { 00:06:34.748 "rw_ios_per_sec": 0, 00:06:34.748 "rw_mbytes_per_sec": 0, 00:06:34.748 "r_mbytes_per_sec": 0, 00:06:34.748 "w_mbytes_per_sec": 
0 00:06:34.748 }, 00:06:34.748 "claimed": false, 00:06:34.748 "zoned": false, 00:06:34.748 "supported_io_types": { 00:06:34.748 "read": true, 00:06:34.748 "write": true, 00:06:34.748 "unmap": true, 00:06:34.748 "flush": true, 00:06:34.748 "reset": true, 00:06:34.748 "nvme_admin": false, 00:06:34.748 "nvme_io": false, 00:06:34.748 "nvme_io_md": false, 00:06:34.748 "write_zeroes": true, 00:06:34.748 "zcopy": true, 00:06:34.748 "get_zone_info": false, 00:06:34.748 "zone_management": false, 00:06:34.748 "zone_append": false, 00:06:34.748 "compare": false, 00:06:34.748 "compare_and_write": false, 00:06:34.748 "abort": true, 00:06:34.748 "seek_hole": false, 00:06:34.748 "seek_data": false, 00:06:34.748 "copy": true, 00:06:34.748 "nvme_iov_md": false 00:06:34.748 }, 00:06:34.748 "memory_domains": [ 00:06:34.748 { 00:06:34.748 "dma_device_id": "system", 00:06:34.748 "dma_device_type": 1 00:06:34.748 }, 00:06:34.748 { 00:06:34.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:34.748 "dma_device_type": 2 00:06:34.748 } 00:06:34.748 ], 00:06:34.748 "driver_specific": {} 00:06:34.748 } 00:06:34.748 ]' 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.749 [2024-10-27 21:31:36.376399] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:34.749 [2024-10-27 21:31:36.376431] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:34.749 [2024-10-27 21:31:36.376447] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x49bac10 00:06:34.749 [2024-10-27 21:31:36.376457] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:34.749 [2024-10-27 21:31:36.377287] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:34.749 [2024-10-27 21:31:36.377310] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:34.749 Passthru0 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:34.749 { 00:06:34.749 "name": "Malloc0", 00:06:34.749 "aliases": [ 00:06:34.749 "70d62f6d-8c4b-4ca0-9879-644a39732e03" 00:06:34.749 ], 00:06:34.749 "product_name": "Malloc disk", 00:06:34.749 "block_size": 512, 00:06:34.749 "num_blocks": 16384, 00:06:34.749 "uuid": "70d62f6d-8c4b-4ca0-9879-644a39732e03", 00:06:34.749 "assigned_rate_limits": { 00:06:34.749 "rw_ios_per_sec": 0, 00:06:34.749 "rw_mbytes_per_sec": 0, 00:06:34.749 "r_mbytes_per_sec": 0, 00:06:34.749 "w_mbytes_per_sec": 0 00:06:34.749 }, 00:06:34.749 "claimed": true, 00:06:34.749 "claim_type": "exclusive_write", 00:06:34.749 "zoned": false, 00:06:34.749 "supported_io_types": { 00:06:34.749 "read": true, 00:06:34.749 "write": true, 00:06:34.749 "unmap": true, 
00:06:34.749 "flush": true, 00:06:34.749 "reset": true, 00:06:34.749 "nvme_admin": false, 00:06:34.749 "nvme_io": false, 00:06:34.749 "nvme_io_md": false, 00:06:34.749 "write_zeroes": true, 00:06:34.749 "zcopy": true, 00:06:34.749 "get_zone_info": false, 00:06:34.749 "zone_management": false, 00:06:34.749 "zone_append": false, 00:06:34.749 "compare": false, 00:06:34.749 "compare_and_write": false, 00:06:34.749 "abort": true, 00:06:34.749 "seek_hole": false, 00:06:34.749 "seek_data": false, 00:06:34.749 "copy": true, 00:06:34.749 "nvme_iov_md": false 00:06:34.749 }, 00:06:34.749 "memory_domains": [ 00:06:34.749 { 00:06:34.749 "dma_device_id": "system", 00:06:34.749 "dma_device_type": 1 00:06:34.749 }, 00:06:34.749 { 00:06:34.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:34.749 "dma_device_type": 2 00:06:34.749 } 00:06:34.749 ], 00:06:34.749 "driver_specific": {} 00:06:34.749 }, 00:06:34.749 { 00:06:34.749 "name": "Passthru0", 00:06:34.749 "aliases": [ 00:06:34.749 "87c17c49-b0a0-53ef-bb2d-817dcfbbd39f" 00:06:34.749 ], 00:06:34.749 "product_name": "passthru", 00:06:34.749 "block_size": 512, 00:06:34.749 "num_blocks": 16384, 00:06:34.749 "uuid": "87c17c49-b0a0-53ef-bb2d-817dcfbbd39f", 00:06:34.749 "assigned_rate_limits": { 00:06:34.749 "rw_ios_per_sec": 0, 00:06:34.749 "rw_mbytes_per_sec": 0, 00:06:34.749 "r_mbytes_per_sec": 0, 00:06:34.749 "w_mbytes_per_sec": 0 00:06:34.749 }, 00:06:34.749 "claimed": false, 00:06:34.749 "zoned": false, 00:06:34.749 "supported_io_types": { 00:06:34.749 "read": true, 00:06:34.749 "write": true, 00:06:34.749 "unmap": true, 00:06:34.749 "flush": true, 00:06:34.749 "reset": true, 00:06:34.749 "nvme_admin": false, 00:06:34.749 "nvme_io": false, 00:06:34.749 "nvme_io_md": false, 00:06:34.749 "write_zeroes": true, 00:06:34.749 "zcopy": true, 00:06:34.749 "get_zone_info": false, 00:06:34.749 "zone_management": false, 00:06:34.749 "zone_append": false, 00:06:34.749 "compare": false, 00:06:34.749 "compare_and_write": false, 00:06:34.749 "abort": true, 00:06:34.749 "seek_hole": false, 00:06:34.749 "seek_data": false, 00:06:34.749 "copy": true, 00:06:34.749 "nvme_iov_md": false 00:06:34.749 }, 00:06:34.749 "memory_domains": [ 00:06:34.749 { 00:06:34.749 "dma_device_id": "system", 00:06:34.749 "dma_device_type": 1 00:06:34.749 }, 00:06:34.749 { 00:06:34.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:34.749 "dma_device_type": 2 00:06:34.749 } 00:06:34.749 ], 00:06:34.749 "driver_specific": { 00:06:34.749 "passthru": { 00:06:34.749 "name": "Passthru0", 00:06:34.749 "base_bdev_name": "Malloc0" 00:06:34.749 } 00:06:34.749 } 00:06:34.749 } 00:06:34.749 ]' 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.749 21:31:36 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.749 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:34.749 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:35.007 21:31:36 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:35.007 00:06:35.007 real 0m0.276s 00:06:35.008 user 0m0.164s 00:06:35.008 sys 0m0.054s 00:06:35.008 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:35.008 21:31:36 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.008 ************************************ 00:06:35.008 END TEST rpc_integrity 00:06:35.008 ************************************ 00:06:35.008 21:31:36 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:35.008 21:31:36 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:35.008 21:31:36 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:35.008 21:31:36 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.008 ************************************ 00:06:35.008 START TEST rpc_plugins 00:06:35.008 ************************************ 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:35.008 { 00:06:35.008 "name": "Malloc1", 00:06:35.008 "aliases": [ 00:06:35.008 "825c7cdd-ee8b-4df5-8375-28e69f287747" 00:06:35.008 ], 00:06:35.008 "product_name": "Malloc disk", 00:06:35.008 "block_size": 4096, 00:06:35.008 "num_blocks": 256, 00:06:35.008 "uuid": "825c7cdd-ee8b-4df5-8375-28e69f287747", 00:06:35.008 "assigned_rate_limits": { 00:06:35.008 "rw_ios_per_sec": 0, 00:06:35.008 "rw_mbytes_per_sec": 0, 00:06:35.008 "r_mbytes_per_sec": 0, 00:06:35.008 "w_mbytes_per_sec": 0 00:06:35.008 }, 00:06:35.008 "claimed": false, 00:06:35.008 "zoned": false, 00:06:35.008 "supported_io_types": { 00:06:35.008 "read": true, 00:06:35.008 "write": true, 00:06:35.008 "unmap": true, 00:06:35.008 "flush": true, 00:06:35.008 "reset": true, 00:06:35.008 "nvme_admin": false, 00:06:35.008 "nvme_io": false, 00:06:35.008 "nvme_io_md": false, 00:06:35.008 "write_zeroes": true, 00:06:35.008 "zcopy": true, 00:06:35.008 "get_zone_info": false, 00:06:35.008 "zone_management": false, 00:06:35.008 "zone_append": false, 00:06:35.008 "compare": false, 00:06:35.008 "compare_and_write": false, 00:06:35.008 "abort": true, 00:06:35.008 "seek_hole": false, 00:06:35.008 "seek_data": false, 00:06:35.008 "copy": true, 00:06:35.008 
"nvme_iov_md": false 00:06:35.008 }, 00:06:35.008 "memory_domains": [ 00:06:35.008 { 00:06:35.008 "dma_device_id": "system", 00:06:35.008 "dma_device_type": 1 00:06:35.008 }, 00:06:35.008 { 00:06:35.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.008 "dma_device_type": 2 00:06:35.008 } 00:06:35.008 ], 00:06:35.008 "driver_specific": {} 00:06:35.008 } 00:06:35.008 ]' 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:35.008 21:31:36 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:35.008 00:06:35.008 real 0m0.120s 00:06:35.008 user 0m0.069s 00:06:35.008 sys 0m0.018s 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:35.008 21:31:36 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.008 ************************************ 00:06:35.008 END TEST rpc_plugins 00:06:35.008 ************************************ 00:06:35.266 21:31:36 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:35.267 21:31:36 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:35.267 21:31:36 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:35.267 21:31:36 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.267 ************************************ 00:06:35.267 START TEST rpc_trace_cmd_test 00:06:35.267 ************************************ 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:35.267 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid3305787", 00:06:35.267 "tpoint_group_mask": "0x8", 00:06:35.267 "iscsi_conn": { 00:06:35.267 "mask": "0x2", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "scsi": { 00:06:35.267 "mask": "0x4", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "bdev": { 00:06:35.267 "mask": "0x8", 00:06:35.267 "tpoint_mask": "0xffffffffffffffff" 00:06:35.267 }, 00:06:35.267 "nvmf_rdma": { 00:06:35.267 "mask": "0x10", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "nvmf_tcp": { 00:06:35.267 "mask": "0x20", 
00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "ftl": { 00:06:35.267 "mask": "0x40", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "blobfs": { 00:06:35.267 "mask": "0x80", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "dsa": { 00:06:35.267 "mask": "0x200", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "thread": { 00:06:35.267 "mask": "0x400", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "nvme_pcie": { 00:06:35.267 "mask": "0x800", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "iaa": { 00:06:35.267 "mask": "0x1000", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "nvme_tcp": { 00:06:35.267 "mask": "0x2000", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "bdev_nvme": { 00:06:35.267 "mask": "0x4000", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "sock": { 00:06:35.267 "mask": "0x8000", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "blob": { 00:06:35.267 "mask": "0x10000", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "bdev_raid": { 00:06:35.267 "mask": "0x20000", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 }, 00:06:35.267 "scheduler": { 00:06:35.267 "mask": "0x40000", 00:06:35.267 "tpoint_mask": "0x0" 00:06:35.267 } 00:06:35.267 }' 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 19 -gt 2 ']' 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:35.267 00:06:35.267 real 0m0.191s 00:06:35.267 user 0m0.152s 00:06:35.267 sys 0m0.032s 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:35.267 21:31:36 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:35.267 ************************************ 00:06:35.267 END TEST rpc_trace_cmd_test 00:06:35.267 ************************************ 00:06:35.526 21:31:37 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:35.526 21:31:37 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:35.526 21:31:37 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:35.526 21:31:37 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:35.526 21:31:37 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:35.526 21:31:37 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.526 ************************************ 00:06:35.526 START TEST rpc_daemon_integrity 00:06:35.526 ************************************ 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.526 21:31:37 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:35.526 { 00:06:35.526 "name": "Malloc2", 00:06:35.526 "aliases": [ 00:06:35.526 "ed26b64b-9dca-43c2-9297-4356b9e610e2" 00:06:35.526 ], 00:06:35.526 "product_name": "Malloc disk", 00:06:35.526 "block_size": 512, 00:06:35.526 "num_blocks": 16384, 00:06:35.526 "uuid": "ed26b64b-9dca-43c2-9297-4356b9e610e2", 00:06:35.526 "assigned_rate_limits": { 00:06:35.526 "rw_ios_per_sec": 0, 00:06:35.526 "rw_mbytes_per_sec": 0, 00:06:35.526 "r_mbytes_per_sec": 0, 00:06:35.526 "w_mbytes_per_sec": 0 00:06:35.526 }, 00:06:35.526 "claimed": false, 00:06:35.526 "zoned": false, 00:06:35.526 "supported_io_types": { 00:06:35.526 "read": true, 00:06:35.526 "write": true, 00:06:35.526 "unmap": true, 00:06:35.526 "flush": true, 00:06:35.526 "reset": true, 00:06:35.526 "nvme_admin": false, 00:06:35.526 "nvme_io": false, 00:06:35.526 "nvme_io_md": false, 00:06:35.526 "write_zeroes": true, 00:06:35.526 "zcopy": true, 00:06:35.526 "get_zone_info": false, 00:06:35.526 "zone_management": false, 00:06:35.526 "zone_append": false, 00:06:35.526 "compare": false, 00:06:35.526 "compare_and_write": false, 00:06:35.526 "abort": true, 00:06:35.526 "seek_hole": false, 00:06:35.526 "seek_data": false, 00:06:35.526 "copy": true, 00:06:35.526 "nvme_iov_md": false 00:06:35.526 }, 00:06:35.526 "memory_domains": [ 00:06:35.526 { 00:06:35.526 "dma_device_id": "system", 00:06:35.526 "dma_device_type": 1 00:06:35.526 }, 00:06:35.526 { 00:06:35.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.526 "dma_device_type": 2 00:06:35.526 } 00:06:35.526 ], 00:06:35.526 "driver_specific": {} 00:06:35.526 } 00:06:35.526 ]' 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.526 [2024-10-27 21:31:37.204568] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:35.526 
[2024-10-27 21:31:37.204599] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:35.526 [2024-10-27 21:31:37.204620] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4add700 00:06:35.526 [2024-10-27 21:31:37.204629] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:35.526 [2024-10-27 21:31:37.205371] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:35.526 [2024-10-27 21:31:37.205395] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:35.526 Passthru0 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:35.526 { 00:06:35.526 "name": "Malloc2", 00:06:35.526 "aliases": [ 00:06:35.526 "ed26b64b-9dca-43c2-9297-4356b9e610e2" 00:06:35.526 ], 00:06:35.526 "product_name": "Malloc disk", 00:06:35.526 "block_size": 512, 00:06:35.526 "num_blocks": 16384, 00:06:35.526 "uuid": "ed26b64b-9dca-43c2-9297-4356b9e610e2", 00:06:35.526 "assigned_rate_limits": { 00:06:35.526 "rw_ios_per_sec": 0, 00:06:35.526 "rw_mbytes_per_sec": 0, 00:06:35.526 "r_mbytes_per_sec": 0, 00:06:35.526 "w_mbytes_per_sec": 0 00:06:35.526 }, 00:06:35.526 "claimed": true, 00:06:35.526 "claim_type": "exclusive_write", 00:06:35.526 "zoned": false, 00:06:35.526 "supported_io_types": { 00:06:35.526 "read": true, 00:06:35.526 "write": true, 00:06:35.526 "unmap": true, 00:06:35.526 "flush": true, 00:06:35.526 "reset": true, 00:06:35.526 "nvme_admin": false, 00:06:35.526 "nvme_io": false, 00:06:35.526 "nvme_io_md": false, 00:06:35.526 "write_zeroes": true, 00:06:35.526 "zcopy": true, 00:06:35.526 "get_zone_info": false, 00:06:35.526 "zone_management": false, 00:06:35.526 "zone_append": false, 00:06:35.526 "compare": false, 00:06:35.526 "compare_and_write": false, 00:06:35.526 "abort": true, 00:06:35.526 "seek_hole": false, 00:06:35.526 "seek_data": false, 00:06:35.526 "copy": true, 00:06:35.526 "nvme_iov_md": false 00:06:35.526 }, 00:06:35.526 "memory_domains": [ 00:06:35.526 { 00:06:35.526 "dma_device_id": "system", 00:06:35.526 "dma_device_type": 1 00:06:35.526 }, 00:06:35.526 { 00:06:35.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.526 "dma_device_type": 2 00:06:35.526 } 00:06:35.526 ], 00:06:35.526 "driver_specific": {} 00:06:35.526 }, 00:06:35.526 { 00:06:35.526 "name": "Passthru0", 00:06:35.526 "aliases": [ 00:06:35.526 "8f2db597-86fd-5f4a-bcc9-dbc62f35484d" 00:06:35.526 ], 00:06:35.526 "product_name": "passthru", 00:06:35.526 "block_size": 512, 00:06:35.526 "num_blocks": 16384, 00:06:35.526 "uuid": "8f2db597-86fd-5f4a-bcc9-dbc62f35484d", 00:06:35.526 "assigned_rate_limits": { 00:06:35.526 "rw_ios_per_sec": 0, 00:06:35.526 "rw_mbytes_per_sec": 0, 00:06:35.526 "r_mbytes_per_sec": 0, 00:06:35.526 "w_mbytes_per_sec": 0 00:06:35.526 }, 00:06:35.526 "claimed": false, 00:06:35.526 "zoned": false, 00:06:35.526 "supported_io_types": { 00:06:35.526 "read": true, 00:06:35.526 "write": true, 00:06:35.526 "unmap": true, 00:06:35.526 "flush": true, 00:06:35.526 "reset": true, 
00:06:35.526 "nvme_admin": false, 00:06:35.526 "nvme_io": false, 00:06:35.526 "nvme_io_md": false, 00:06:35.526 "write_zeroes": true, 00:06:35.526 "zcopy": true, 00:06:35.526 "get_zone_info": false, 00:06:35.526 "zone_management": false, 00:06:35.526 "zone_append": false, 00:06:35.526 "compare": false, 00:06:35.526 "compare_and_write": false, 00:06:35.526 "abort": true, 00:06:35.526 "seek_hole": false, 00:06:35.526 "seek_data": false, 00:06:35.526 "copy": true, 00:06:35.526 "nvme_iov_md": false 00:06:35.526 }, 00:06:35.526 "memory_domains": [ 00:06:35.526 { 00:06:35.526 "dma_device_id": "system", 00:06:35.526 "dma_device_type": 1 00:06:35.526 }, 00:06:35.526 { 00:06:35.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.526 "dma_device_type": 2 00:06:35.526 } 00:06:35.526 ], 00:06:35.526 "driver_specific": { 00:06:35.526 "passthru": { 00:06:35.526 "name": "Passthru0", 00:06:35.526 "base_bdev_name": "Malloc2" 00:06:35.526 } 00:06:35.526 } 00:06:35.526 } 00:06:35.526 ]' 00:06:35.526 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:35.786 00:06:35.786 real 0m0.270s 00:06:35.786 user 0m0.172s 00:06:35.786 sys 0m0.038s 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:35.786 21:31:37 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.786 ************************************ 00:06:35.786 END TEST rpc_daemon_integrity 00:06:35.786 ************************************ 00:06:35.786 21:31:37 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:35.786 21:31:37 rpc -- rpc/rpc.sh@84 -- # killprocess 3305787 00:06:35.786 21:31:37 rpc -- common/autotest_common.sh@950 -- # '[' -z 3305787 ']' 00:06:35.786 21:31:37 rpc -- common/autotest_common.sh@954 -- # kill -0 3305787 00:06:35.786 21:31:37 rpc -- common/autotest_common.sh@955 -- # uname 00:06:35.786 21:31:37 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:35.786 21:31:37 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3305787 
00:06:35.786 21:31:37 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:35.786 21:31:37 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:35.786 21:31:37 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3305787' 00:06:35.786 killing process with pid 3305787 00:06:35.786 21:31:37 rpc -- common/autotest_common.sh@969 -- # kill 3305787 00:06:35.786 21:31:37 rpc -- common/autotest_common.sh@974 -- # wait 3305787 00:06:36.045 00:06:36.045 real 0m2.569s 00:06:36.045 user 0m3.133s 00:06:36.045 sys 0m0.805s 00:06:36.045 21:31:37 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:36.045 21:31:37 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:36.045 ************************************ 00:06:36.045 END TEST rpc 00:06:36.045 ************************************ 00:06:36.045 21:31:37 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:36.045 21:31:37 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:36.045 21:31:37 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:36.045 21:31:37 -- common/autotest_common.sh@10 -- # set +x 00:06:36.304 ************************************ 00:06:36.304 START TEST skip_rpc 00:06:36.304 ************************************ 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:36.304 * Looking for test storage... 00:06:36.304 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1689 -- # lcov --version 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@345 -- # : 1 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:36.304 21:31:37 skip_rpc -- scripts/common.sh@368 -- # return 0 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:06:36.304 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.304 --rc genhtml_branch_coverage=1 00:06:36.304 --rc genhtml_function_coverage=1 00:06:36.304 --rc genhtml_legend=1 00:06:36.304 --rc geninfo_all_blocks=1 00:06:36.304 --rc geninfo_unexecuted_blocks=1 00:06:36.304 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.304 ' 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:06:36.304 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.304 --rc genhtml_branch_coverage=1 00:06:36.304 --rc genhtml_function_coverage=1 00:06:36.304 --rc genhtml_legend=1 00:06:36.304 --rc geninfo_all_blocks=1 00:06:36.304 --rc geninfo_unexecuted_blocks=1 00:06:36.304 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.304 ' 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:06:36.304 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.304 --rc genhtml_branch_coverage=1 00:06:36.304 --rc genhtml_function_coverage=1 00:06:36.304 --rc genhtml_legend=1 00:06:36.304 --rc geninfo_all_blocks=1 00:06:36.304 --rc geninfo_unexecuted_blocks=1 00:06:36.304 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.304 ' 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:06:36.304 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.304 --rc genhtml_branch_coverage=1 00:06:36.304 --rc genhtml_function_coverage=1 00:06:36.304 --rc genhtml_legend=1 00:06:36.304 --rc geninfo_all_blocks=1 00:06:36.304 --rc geninfo_unexecuted_blocks=1 00:06:36.304 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.304 ' 00:06:36.304 21:31:37 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:36.304 21:31:37 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:36.304 21:31:37 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:36.304 21:31:37 
skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:36.304 21:31:37 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:36.304 ************************************ 00:06:36.304 START TEST skip_rpc 00:06:36.304 ************************************ 00:06:36.304 21:31:38 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc 00:06:36.304 21:31:38 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=3306503 00:06:36.304 21:31:38 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:36.304 21:31:38 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:36.304 21:31:38 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:36.583 [2024-10-27 21:31:38.049656] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:06:36.583 [2024-10-27 21:31:38.049717] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3306503 ] 00:06:36.583 [2024-10-27 21:31:38.184622] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:36.583 [2024-10-27 21:31:38.218717] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.583 [2024-10-27 21:31:38.240936] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:41.851 21:31:43 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 3306503 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 3306503 ']' 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 3306503 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname 00:06:41.852 21:31:43 
skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3306503 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3306503' 00:06:41.852 killing process with pid 3306503 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 3306503 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 3306503 00:06:41.852 00:06:41.852 real 0m5.370s 00:06:41.852 user 0m5.042s 00:06:41.852 sys 0m0.284s 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:41.852 21:31:43 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:41.852 ************************************ 00:06:41.852 END TEST skip_rpc 00:06:41.852 ************************************ 00:06:41.852 21:31:43 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:41.852 21:31:43 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:41.852 21:31:43 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:41.852 21:31:43 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:41.852 ************************************ 00:06:41.852 START TEST skip_rpc_with_json 00:06:41.852 ************************************ 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=3307499 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 3307499 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 3307499 ']' 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:41.852 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:41.852 21:31:43 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:41.852 [2024-10-27 21:31:43.505439] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 
00:06:41.852 [2024-10-27 21:31:43.505520] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3307499 ] 00:06:42.111 [2024-10-27 21:31:43.642385] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:42.111 [2024-10-27 21:31:43.678226] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.111 [2024-10-27 21:31:43.700404] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.677 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:42.677 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0 00:06:42.677 21:31:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:42.677 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:42.677 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:42.677 [2024-10-27 21:31:44.353865] nvmf_rpc.c:2703:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:42.677 request: 00:06:42.677 { 00:06:42.677 "trtype": "tcp", 00:06:42.677 "method": "nvmf_get_transports", 00:06:42.677 "req_id": 1 00:06:42.677 } 00:06:42.677 Got JSON-RPC error response 00:06:42.677 response: 00:06:42.677 { 00:06:42.677 "code": -19, 00:06:42.677 "message": "No such device" 00:06:42.677 } 00:06:42.677 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:42.677 21:31:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:42.677 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:42.677 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:42.677 [2024-10-27 21:31:44.365925] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:42.678 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:42.678 21:31:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:42.678 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:42.678 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:42.937 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:42.937 21:31:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:42.937 { 00:06:42.937 "subsystems": [ 00:06:42.937 { 00:06:42.937 "subsystem": "scheduler", 00:06:42.937 "config": [ 00:06:42.937 { 00:06:42.937 "method": "framework_set_scheduler", 00:06:42.937 "params": { 00:06:42.937 "name": "static" 00:06:42.937 } 00:06:42.937 } 00:06:42.937 ] 00:06:42.937 }, 00:06:42.937 { 00:06:42.937 "subsystem": "vmd", 00:06:42.937 "config": [] 00:06:42.937 }, 00:06:42.937 { 00:06:42.937 "subsystem": "sock", 00:06:42.937 "config": [ 00:06:42.937 { 00:06:42.937 "method": "sock_set_default_impl", 00:06:42.937 "params": { 00:06:42.937 "impl_name": "posix" 00:06:42.937 } 00:06:42.937 }, 00:06:42.937 { 00:06:42.937 "method": "sock_impl_set_options", 00:06:42.937 "params": { 00:06:42.937 "impl_name": 
"ssl", 00:06:42.937 "recv_buf_size": 4096, 00:06:42.937 "send_buf_size": 4096, 00:06:42.937 "enable_recv_pipe": true, 00:06:42.937 "enable_quickack": false, 00:06:42.937 "enable_placement_id": 0, 00:06:42.937 "enable_zerocopy_send_server": true, 00:06:42.937 "enable_zerocopy_send_client": false, 00:06:42.937 "zerocopy_threshold": 0, 00:06:42.937 "tls_version": 0, 00:06:42.937 "enable_ktls": false 00:06:42.937 } 00:06:42.937 }, 00:06:42.937 { 00:06:42.937 "method": "sock_impl_set_options", 00:06:42.937 "params": { 00:06:42.937 "impl_name": "posix", 00:06:42.937 "recv_buf_size": 2097152, 00:06:42.937 "send_buf_size": 2097152, 00:06:42.937 "enable_recv_pipe": true, 00:06:42.937 "enable_quickack": false, 00:06:42.937 "enable_placement_id": 0, 00:06:42.937 "enable_zerocopy_send_server": true, 00:06:42.937 "enable_zerocopy_send_client": false, 00:06:42.937 "zerocopy_threshold": 0, 00:06:42.937 "tls_version": 0, 00:06:42.937 "enable_ktls": false 00:06:42.937 } 00:06:42.937 } 00:06:42.937 ] 00:06:42.937 }, 00:06:42.937 { 00:06:42.937 "subsystem": "iobuf", 00:06:42.937 "config": [ 00:06:42.937 { 00:06:42.937 "method": "iobuf_set_options", 00:06:42.937 "params": { 00:06:42.937 "small_pool_count": 8192, 00:06:42.937 "large_pool_count": 1024, 00:06:42.937 "small_bufsize": 8192, 00:06:42.937 "large_bufsize": 135168, 00:06:42.937 "enable_numa": false 00:06:42.937 } 00:06:42.937 } 00:06:42.937 ] 00:06:42.937 }, 00:06:42.937 { 00:06:42.937 "subsystem": "keyring", 00:06:42.937 "config": [] 00:06:42.937 }, 00:06:42.937 { 00:06:42.937 "subsystem": "vfio_user_target", 00:06:42.937 "config": null 00:06:42.937 }, 00:06:42.937 { 00:06:42.937 "subsystem": "fsdev", 00:06:42.937 "config": [ 00:06:42.937 { 00:06:42.937 "method": "fsdev_set_opts", 00:06:42.937 "params": { 00:06:42.937 "fsdev_io_pool_size": 65535, 00:06:42.937 "fsdev_io_cache_size": 256 00:06:42.937 } 00:06:42.937 } 00:06:42.937 ] 00:06:42.937 }, 00:06:42.937 { 00:06:42.937 "subsystem": "accel", 00:06:42.937 "config": [ 00:06:42.937 { 00:06:42.937 "method": "accel_set_options", 00:06:42.937 "params": { 00:06:42.937 "small_cache_size": 128, 00:06:42.937 "large_cache_size": 16, 00:06:42.937 "task_count": 2048, 00:06:42.937 "sequence_count": 2048, 00:06:42.937 "buf_count": 2048 00:06:42.937 } 00:06:42.938 } 00:06:42.938 ] 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "subsystem": "bdev", 00:06:42.938 "config": [ 00:06:42.938 { 00:06:42.938 "method": "bdev_set_options", 00:06:42.938 "params": { 00:06:42.938 "bdev_io_pool_size": 65535, 00:06:42.938 "bdev_io_cache_size": 256, 00:06:42.938 "bdev_auto_examine": true, 00:06:42.938 "iobuf_small_cache_size": 128, 00:06:42.938 "iobuf_large_cache_size": 16 00:06:42.938 } 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "method": "bdev_raid_set_options", 00:06:42.938 "params": { 00:06:42.938 "process_window_size_kb": 1024, 00:06:42.938 "process_max_bandwidth_mb_sec": 0 00:06:42.938 } 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "method": "bdev_nvme_set_options", 00:06:42.938 "params": { 00:06:42.938 "action_on_timeout": "none", 00:06:42.938 "timeout_us": 0, 00:06:42.938 "timeout_admin_us": 0, 00:06:42.938 "keep_alive_timeout_ms": 10000, 00:06:42.938 "arbitration_burst": 0, 00:06:42.938 "low_priority_weight": 0, 00:06:42.938 "medium_priority_weight": 0, 00:06:42.938 "high_priority_weight": 0, 00:06:42.938 "nvme_adminq_poll_period_us": 10000, 00:06:42.938 "nvme_ioq_poll_period_us": 0, 00:06:42.938 "io_queue_requests": 0, 00:06:42.938 "delay_cmd_submit": true, 00:06:42.938 "transport_retry_count": 4, 00:06:42.938 
"bdev_retry_count": 3, 00:06:42.938 "transport_ack_timeout": 0, 00:06:42.938 "ctrlr_loss_timeout_sec": 0, 00:06:42.938 "reconnect_delay_sec": 0, 00:06:42.938 "fast_io_fail_timeout_sec": 0, 00:06:42.938 "disable_auto_failback": false, 00:06:42.938 "generate_uuids": false, 00:06:42.938 "transport_tos": 0, 00:06:42.938 "nvme_error_stat": false, 00:06:42.938 "rdma_srq_size": 0, 00:06:42.938 "io_path_stat": false, 00:06:42.938 "allow_accel_sequence": false, 00:06:42.938 "rdma_max_cq_size": 0, 00:06:42.938 "rdma_cm_event_timeout_ms": 0, 00:06:42.938 "dhchap_digests": [ 00:06:42.938 "sha256", 00:06:42.938 "sha384", 00:06:42.938 "sha512" 00:06:42.938 ], 00:06:42.938 "dhchap_dhgroups": [ 00:06:42.938 "null", 00:06:42.938 "ffdhe2048", 00:06:42.938 "ffdhe3072", 00:06:42.938 "ffdhe4096", 00:06:42.938 "ffdhe6144", 00:06:42.938 "ffdhe8192" 00:06:42.938 ] 00:06:42.938 } 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "method": "bdev_nvme_set_hotplug", 00:06:42.938 "params": { 00:06:42.938 "period_us": 100000, 00:06:42.938 "enable": false 00:06:42.938 } 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "method": "bdev_iscsi_set_options", 00:06:42.938 "params": { 00:06:42.938 "timeout_sec": 30 00:06:42.938 } 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "method": "bdev_wait_for_examine" 00:06:42.938 } 00:06:42.938 ] 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "subsystem": "nvmf", 00:06:42.938 "config": [ 00:06:42.938 { 00:06:42.938 "method": "nvmf_set_config", 00:06:42.938 "params": { 00:06:42.938 "discovery_filter": "match_any", 00:06:42.938 "admin_cmd_passthru": { 00:06:42.938 "identify_ctrlr": false 00:06:42.938 }, 00:06:42.938 "dhchap_digests": [ 00:06:42.938 "sha256", 00:06:42.938 "sha384", 00:06:42.938 "sha512" 00:06:42.938 ], 00:06:42.938 "dhchap_dhgroups": [ 00:06:42.938 "null", 00:06:42.938 "ffdhe2048", 00:06:42.938 "ffdhe3072", 00:06:42.938 "ffdhe4096", 00:06:42.938 "ffdhe6144", 00:06:42.938 "ffdhe8192" 00:06:42.938 ] 00:06:42.938 } 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "method": "nvmf_set_max_subsystems", 00:06:42.938 "params": { 00:06:42.938 "max_subsystems": 1024 00:06:42.938 } 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "method": "nvmf_set_crdt", 00:06:42.938 "params": { 00:06:42.938 "crdt1": 0, 00:06:42.938 "crdt2": 0, 00:06:42.938 "crdt3": 0 00:06:42.938 } 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "method": "nvmf_create_transport", 00:06:42.938 "params": { 00:06:42.938 "trtype": "TCP", 00:06:42.938 "max_queue_depth": 128, 00:06:42.938 "max_io_qpairs_per_ctrlr": 127, 00:06:42.938 "in_capsule_data_size": 4096, 00:06:42.938 "max_io_size": 131072, 00:06:42.938 "io_unit_size": 131072, 00:06:42.938 "max_aq_depth": 128, 00:06:42.938 "num_shared_buffers": 511, 00:06:42.938 "buf_cache_size": 4294967295, 00:06:42.938 "dif_insert_or_strip": false, 00:06:42.938 "zcopy": false, 00:06:42.938 "c2h_success": true, 00:06:42.938 "sock_priority": 0, 00:06:42.938 "abort_timeout_sec": 1, 00:06:42.938 "ack_timeout": 0, 00:06:42.938 "data_wr_pool_size": 0 00:06:42.938 } 00:06:42.938 } 00:06:42.938 ] 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "subsystem": "nbd", 00:06:42.938 "config": [] 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "subsystem": "ublk", 00:06:42.938 "config": [] 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "subsystem": "vhost_blk", 00:06:42.938 "config": [] 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "subsystem": "scsi", 00:06:42.938 "config": null 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "subsystem": "iscsi", 00:06:42.938 "config": [ 00:06:42.938 { 00:06:42.938 "method": "iscsi_set_options", 
00:06:42.938 "params": { 00:06:42.938 "node_base": "iqn.2016-06.io.spdk", 00:06:42.938 "max_sessions": 128, 00:06:42.938 "max_connections_per_session": 2, 00:06:42.938 "max_queue_depth": 64, 00:06:42.938 "default_time2wait": 2, 00:06:42.938 "default_time2retain": 20, 00:06:42.938 "first_burst_length": 8192, 00:06:42.938 "immediate_data": true, 00:06:42.938 "allow_duplicated_isid": false, 00:06:42.938 "error_recovery_level": 0, 00:06:42.938 "nop_timeout": 60, 00:06:42.938 "nop_in_interval": 30, 00:06:42.938 "disable_chap": false, 00:06:42.938 "require_chap": false, 00:06:42.938 "mutual_chap": false, 00:06:42.938 "chap_group": 0, 00:06:42.938 "max_large_datain_per_connection": 64, 00:06:42.938 "max_r2t_per_connection": 4, 00:06:42.938 "pdu_pool_size": 36864, 00:06:42.938 "immediate_data_pool_size": 16384, 00:06:42.938 "data_out_pool_size": 2048 00:06:42.938 } 00:06:42.938 } 00:06:42.938 ] 00:06:42.938 }, 00:06:42.938 { 00:06:42.938 "subsystem": "vhost_scsi", 00:06:42.938 "config": [] 00:06:42.938 } 00:06:42.938 ] 00:06:42.938 } 00:06:42.938 21:31:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:42.938 21:31:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 3307499 00:06:42.938 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 3307499 ']' 00:06:42.938 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 3307499 00:06:42.938 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:06:42.938 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:42.938 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3307499 00:06:42.938 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:42.938 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:42.938 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3307499' 00:06:42.938 killing process with pid 3307499 00:06:42.939 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 3307499 00:06:42.939 21:31:44 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 3307499 00:06:43.198 21:31:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=3307716 00:06:43.198 21:31:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:43.198 21:31:44 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 3307716 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 3307716 ']' 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 3307716 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3307716 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3307716' 00:06:48.466 killing process with pid 3307716 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 3307716 00:06:48.466 21:31:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 3307716 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:48.725 00:06:48.725 real 0m6.781s 00:06:48.725 user 0m6.423s 00:06:48.725 sys 0m0.663s 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:48.725 ************************************ 00:06:48.725 END TEST skip_rpc_with_json 00:06:48.725 ************************************ 00:06:48.725 21:31:50 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:48.725 21:31:50 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:48.725 21:31:50 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:48.725 21:31:50 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:48.725 ************************************ 00:06:48.725 START TEST skip_rpc_with_delay 00:06:48.725 ************************************ 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:48.725 [2024-10-27 21:31:50.368540] app.c: 842:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:48.725 00:06:48.725 real 0m0.047s 00:06:48.725 user 0m0.018s 00:06:48.725 sys 0m0.029s 00:06:48.725 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:48.726 21:31:50 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:48.726 ************************************ 00:06:48.726 END TEST skip_rpc_with_delay 00:06:48.726 ************************************ 00:06:48.726 21:31:50 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:48.726 21:31:50 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:48.726 21:31:50 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:48.726 21:31:50 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:48.726 21:31:50 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:48.726 21:31:50 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:48.985 ************************************ 00:06:48.985 START TEST exit_on_failed_rpc_init 00:06:48.985 ************************************ 00:06:48.985 21:31:50 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init 00:06:48.985 21:31:50 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=3308724 00:06:48.985 21:31:50 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 3308724 00:06:48.985 21:31:50 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:48.985 21:31:50 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 3308724 ']' 00:06:48.985 21:31:50 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:48.985 21:31:50 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:48.985 21:31:50 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:48.985 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:48.985 21:31:50 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:48.985 21:31:50 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:48.985 [2024-10-27 21:31:50.493463] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 
00:06:48.985 [2024-10-27 21:31:50.493524] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3308724 ] 00:06:48.985 [2024-10-27 21:31:50.630818] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:48.985 [2024-10-27 21:31:50.665650] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.985 [2024-10-27 21:31:50.687544] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.920 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:49.920 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0 00:06:49.920 21:31:51 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:49.921 [2024-10-27 21:31:51.374322] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:06:49.921 [2024-10-27 21:31:51.374387] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3308962 ] 00:06:49.921 [2024-10-27 21:31:51.510279] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
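The waitforlisten gate above blocks until the freshly forked target owns its RPC socket. A minimal sketch under assumed simplifications (the real helper probes the server rather than just testing the socket file, and honors the max_retries=100 seen in the trace):

    waitforlisten() {
        local pid=$1 sock=${2:-/var/tmp/spdk.sock}
        local i
        for ((i = 0; i < 100; i++)); do
            kill -0 "$pid" 2>/dev/null || return 1    # target died early
            [ -S "$sock" ] && return 0                # RPC socket is up
            sleep 0.1
        done
        return 1                                      # timed out
    }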
00:06:49.921 [2024-10-27 21:31:51.545164] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.921 [2024-10-27 21:31:51.567069] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:49.921 [2024-10-27 21:31:51.567140] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 00:06:49.921 [2024-10-27 21:31:51.567152] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:49.921 [2024-10-27 21:31:51.567160] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 3308724 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 3308724 ']' 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 3308724 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:49.921 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3308724 00:06:50.179 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:50.179 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:50.179 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3308724' 00:06:50.179 killing process with pid 3308724 00:06:50.180 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 3308724 00:06:50.180 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 3308724 00:06:50.438 00:06:50.438 real 0m1.477s 00:06:50.438 user 0m1.508s 00:06:50.438 sys 0m0.441s 00:06:50.438 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:50.438 21:31:51 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:50.438 ************************************ 00:06:50.438 END TEST exit_on_failed_rpc_init 00:06:50.438 ************************************ 00:06:50.438 21:31:51 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:50.438 00:06:50.438 real 0m14.180s 00:06:50.438 user 0m13.205s 00:06:50.438 sys 0m1.746s 00:06:50.438 21:31:51 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:50.438 21:31:51 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.438 ************************************ 00:06:50.438 END TEST skip_rpc 00:06:50.438 
************************************ 00:06:50.438 21:31:52 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:50.438 21:31:52 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:50.438 21:31:52 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:50.438 21:31:52 -- common/autotest_common.sh@10 -- # set +x 00:06:50.438 ************************************ 00:06:50.438 START TEST rpc_client 00:06:50.438 ************************************ 00:06:50.439 21:31:52 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:50.439 * Looking for test storage... 00:06:50.439 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:50.439 21:31:52 rpc_client -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:06:50.439 21:31:52 rpc_client -- common/autotest_common.sh@1689 -- # lcov --version 00:06:50.439 21:31:52 rpc_client -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:06:50.698 21:31:52 rpc_client -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@345 -- # : 1 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@353 -- # local d=1 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@355 -- # echo 1 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@353 -- # local d=2 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@355 -- # echo 2 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:50.698 21:31:52 rpc_client -- scripts/common.sh@368 -- # return 0 00:06:50.698 21:31:52 rpc_client -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:50.698 21:31:52 rpc_client -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:06:50.698 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.698 --rc genhtml_branch_coverage=1 00:06:50.698 --rc genhtml_function_coverage=1 00:06:50.698 --rc genhtml_legend=1 00:06:50.698 --rc geninfo_all_blocks=1 00:06:50.698 --rc geninfo_unexecuted_blocks=1 00:06:50.698 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.698 ' 00:06:50.698 21:31:52 rpc_client -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:06:50.698 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.698 --rc genhtml_branch_coverage=1 00:06:50.698 --rc genhtml_function_coverage=1 00:06:50.698 --rc genhtml_legend=1 00:06:50.698 --rc geninfo_all_blocks=1 00:06:50.698 --rc geninfo_unexecuted_blocks=1 00:06:50.698 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.698 ' 00:06:50.698 21:31:52 rpc_client -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:06:50.698 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.698 --rc genhtml_branch_coverage=1 00:06:50.698 --rc genhtml_function_coverage=1 00:06:50.698 --rc genhtml_legend=1 00:06:50.698 --rc geninfo_all_blocks=1 00:06:50.698 --rc geninfo_unexecuted_blocks=1 00:06:50.698 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.698 ' 00:06:50.698 21:31:52 rpc_client -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:06:50.698 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.698 --rc genhtml_branch_coverage=1 00:06:50.698 --rc genhtml_function_coverage=1 00:06:50.698 --rc genhtml_legend=1 00:06:50.698 --rc geninfo_all_blocks=1 00:06:50.698 --rc geninfo_unexecuted_blocks=1 00:06:50.698 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.698 ' 00:06:50.698 21:31:52 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:50.698 OK 00:06:50.698 21:31:52 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:50.698 00:06:50.698 real 0m0.196s 00:06:50.698 user 0m0.099s 00:06:50.698 sys 0m0.109s 00:06:50.698 21:31:52 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable 
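The LCOV gate traced above is scripts/common.sh comparing versions component-wise after splitting on '.', '-' and ':'. A condensed sketch of the lt path (input validation via the decimal helper is omitted here):

    lt() {    # lt 1.15 2  ->  true (1.15 < 2)
        local IFS=.-:
        local -a ver1 ver2
        read -ra ver1 <<< "$1"
        read -ra ver2 <<< "$2"
        local ver1_l=${#ver1[@]} ver2_l=${#ver2[@]} v
        for (( v = 0; v < (ver1_l > ver2_l ? ver1_l : ver2_l); v++ )); do
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
            (( ${ver2[v]:-0} < ${ver1[v]:-0} )) && return 1
        done
        return 1    # versions are equal
    }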
00:06:50.698 21:31:52 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:50.698 ************************************ 00:06:50.698 END TEST rpc_client 00:06:50.698 ************************************ 00:06:50.698 21:31:52 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:50.698 21:31:52 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:50.698 21:31:52 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:50.698 21:31:52 -- common/autotest_common.sh@10 -- # set +x 00:06:50.698 ************************************ 00:06:50.698 START TEST json_config 00:06:50.698 ************************************ 00:06:50.698 21:31:52 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:50.698 21:31:52 json_config -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:06:50.698 21:31:52 json_config -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:06:50.698 21:31:52 json_config -- common/autotest_common.sh@1689 -- # lcov --version 00:06:50.958 21:31:52 json_config -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:06:50.958 21:31:52 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:50.958 21:31:52 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:50.958 21:31:52 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:50.958 21:31:52 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:06:50.958 21:31:52 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:06:50.958 21:31:52 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:06:50.958 21:31:52 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:06:50.958 21:31:52 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:06:50.958 21:31:52 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:06:50.958 21:31:52 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:06:50.958 21:31:52 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:50.958 21:31:52 json_config -- scripts/common.sh@344 -- # case "$op" in 00:06:50.958 21:31:52 json_config -- scripts/common.sh@345 -- # : 1 00:06:50.958 21:31:52 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:50.958 21:31:52 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:50.958 21:31:52 json_config -- scripts/common.sh@365 -- # decimal 1 00:06:50.958 21:31:52 json_config -- scripts/common.sh@353 -- # local d=1 00:06:50.958 21:31:52 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:50.958 21:31:52 json_config -- scripts/common.sh@355 -- # echo 1 00:06:50.959 21:31:52 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:06:50.959 21:31:52 json_config -- scripts/common.sh@366 -- # decimal 2 00:06:50.959 21:31:52 json_config -- scripts/common.sh@353 -- # local d=2 00:06:50.959 21:31:52 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:50.959 21:31:52 json_config -- scripts/common.sh@355 -- # echo 2 00:06:50.959 21:31:52 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:06:50.959 21:31:52 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:50.959 21:31:52 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:50.959 21:31:52 json_config -- scripts/common.sh@368 -- # return 0 00:06:50.959 21:31:52 json_config -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:50.959 21:31:52 json_config -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:06:50.959 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.959 --rc genhtml_branch_coverage=1 00:06:50.959 --rc genhtml_function_coverage=1 00:06:50.959 --rc genhtml_legend=1 00:06:50.959 --rc geninfo_all_blocks=1 00:06:50.959 --rc geninfo_unexecuted_blocks=1 00:06:50.959 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.959 ' 00:06:50.959 21:31:52 json_config -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:06:50.959 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.959 --rc genhtml_branch_coverage=1 00:06:50.959 --rc genhtml_function_coverage=1 00:06:50.959 --rc genhtml_legend=1 00:06:50.959 --rc geninfo_all_blocks=1 00:06:50.959 --rc geninfo_unexecuted_blocks=1 00:06:50.959 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.959 ' 00:06:50.959 21:31:52 json_config -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:06:50.959 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.959 --rc genhtml_branch_coverage=1 00:06:50.959 --rc genhtml_function_coverage=1 00:06:50.959 --rc genhtml_legend=1 00:06:50.959 --rc geninfo_all_blocks=1 00:06:50.959 --rc geninfo_unexecuted_blocks=1 00:06:50.959 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.959 ' 00:06:50.959 21:31:52 json_config -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:06:50.959 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.959 --rc genhtml_branch_coverage=1 00:06:50.959 --rc genhtml_function_coverage=1 00:06:50.959 --rc genhtml_legend=1 00:06:50.959 --rc geninfo_all_blocks=1 00:06:50.959 --rc geninfo_unexecuted_blocks=1 00:06:50.959 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.959 ' 00:06:50.959 21:31:52 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:50.959 21:31:52 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:06:50.959 21:31:52 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:50.959 21:31:52 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:50.959 21:31:52 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:50.959 21:31:52 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.959 21:31:52 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.959 21:31:52 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.959 21:31:52 json_config -- paths/export.sh@5 -- # export PATH 00:06:50.959 21:31:52 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@51 -- # : 0 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:50.959 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:50.959 21:31:52 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:50.959 21:31:52 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:50.959 21:31:52 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:50.959 21:31:52 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:50.959 21:31:52 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:50.959 21:31:52 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:50.959 21:31:52 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:50.959 WARNING: No tests are enabled so not running JSON configuration tests 00:06:50.959 21:31:52 json_config -- json_config/json_config.sh@28 -- # exit 0 00:06:50.959 00:06:50.959 real 0m0.185s 00:06:50.959 user 0m0.103s 00:06:50.959 sys 0m0.089s 00:06:50.959 21:31:52 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:50.959 21:31:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:50.959 ************************************ 00:06:50.959 END TEST json_config 00:06:50.959 ************************************ 00:06:50.959 21:31:52 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:50.959 21:31:52 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:50.959 21:31:52 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:50.959 21:31:52 -- common/autotest_common.sh@10 -- # set +x 00:06:50.959 ************************************ 00:06:50.959 START TEST json_config_extra_key 00:06:50.959 ************************************ 00:06:50.959 21:31:52 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:50.959 21:31:52 json_config_extra_key -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:06:50.959 21:31:52 json_config_extra_key -- common/autotest_common.sh@1689 -- # awk '{print 
$NF}' 00:06:50.959 21:31:52 json_config_extra_key -- common/autotest_common.sh@1689 -- # lcov --version 00:06:51.219 21:31:52 json_config_extra_key -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:06:51.219 21:31:52 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:51.219 21:31:52 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:51.219 21:31:52 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:51.219 21:31:52 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:06:51.220 21:31:52 json_config_extra_key -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:51.220 21:31:52 json_config_extra_key -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:06:51.220 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.220 --rc genhtml_branch_coverage=1 00:06:51.220 --rc genhtml_function_coverage=1 00:06:51.220 --rc genhtml_legend=1 00:06:51.220 --rc geninfo_all_blocks=1 00:06:51.220 --rc geninfo_unexecuted_blocks=1 00:06:51.220 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.220 ' 00:06:51.220 21:31:52 json_config_extra_key -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:06:51.220 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.220 --rc genhtml_branch_coverage=1 00:06:51.220 
--rc genhtml_function_coverage=1 00:06:51.220 --rc genhtml_legend=1 00:06:51.220 --rc geninfo_all_blocks=1 00:06:51.220 --rc geninfo_unexecuted_blocks=1 00:06:51.220 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.220 ' 00:06:51.220 21:31:52 json_config_extra_key -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:06:51.220 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.220 --rc genhtml_branch_coverage=1 00:06:51.220 --rc genhtml_function_coverage=1 00:06:51.220 --rc genhtml_legend=1 00:06:51.220 --rc geninfo_all_blocks=1 00:06:51.220 --rc geninfo_unexecuted_blocks=1 00:06:51.220 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.220 ' 00:06:51.220 21:31:52 json_config_extra_key -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:06:51.220 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.220 --rc genhtml_branch_coverage=1 00:06:51.220 --rc genhtml_function_coverage=1 00:06:51.220 --rc genhtml_legend=1 00:06:51.220 --rc geninfo_all_blocks=1 00:06:51.220 --rc geninfo_unexecuted_blocks=1 00:06:51.220 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.220 ' 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:51.220 21:31:52 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:51.220 21:31:52 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:51.220 21:31:52 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.220 21:31:52 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.220 21:31:52 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.220 21:31:52 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:51.220 21:31:52 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:51.220 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:51.220 21:31:52 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:51.220 INFO: launching applications... 00:06:51.220 21:31:52 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=3309354 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:51.220 Waiting for target to run... 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 3309354 /var/tmp/spdk_tgt.sock 00:06:51.220 21:31:52 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 3309354 ']' 00:06:51.220 21:31:52 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:51.220 21:31:52 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:51.221 21:31:52 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:51.221 21:31:52 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:51.221 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
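json_config/common.sh keeps per-app state in the associative arrays declared above, keyed by app name. A condensed sketch of the start-up path just traced, where SPDK_ROOT stands in for the workspace path and waitforlisten is the readiness helper sketched earlier:

    declare -A app_pid=([target]='')
    declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
    declare -A app_params=([target]='-m 0x1 -s 1024')
    declare -A configs_path=([target]="$SPDK_ROOT/test/json_config/extra_key.json")

    json_config_test_start_app() {
        local app=$1; shift
        # app_params is expanded unquoted on purpose: it holds several words.
        "$SPDK_ROOT/build/bin/spdk_tgt" ${app_params[$app]} \
            -r "${app_socket[$app]}" "$@" &
        app_pid[$app]=$!
        waitforlisten "${app_pid[$app]}" "${app_socket[$app]}"
    }

    json_config_test_start_app target --json "${configs_path[target]}"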
00:06:51.221 21:31:52 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:51.221 21:31:52 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:51.221 [2024-10-27 21:31:52.812124] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:06:51.221 [2024-10-27 21:31:52.812191] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3309354 ] 00:06:51.480 [2024-10-27 21:31:53.158073] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:51.480 [2024-10-27 21:31:53.194552] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.738 [2024-10-27 21:31:53.207861] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.997 21:31:53 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:51.997 21:31:53 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:06:51.997 21:31:53 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:51.997 00:06:51.997 21:31:53 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:51.997 INFO: shutting down applications... 00:06:51.997 21:31:53 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:51.997 21:31:53 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:51.997 21:31:53 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:51.997 21:31:53 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 3309354 ]] 00:06:51.997 21:31:53 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 3309354 00:06:51.997 21:31:53 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:51.997 21:31:53 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:51.997 21:31:53 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 3309354 00:06:51.997 21:31:53 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:52.564 21:31:54 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:52.564 21:31:54 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:52.564 21:31:54 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 3309354 00:06:52.564 21:31:54 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:52.564 21:31:54 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:52.564 21:31:54 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:52.564 21:31:54 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:52.564 SPDK target shutdown done 00:06:52.564 21:31:54 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:52.564 Success 00:06:52.564 00:06:52.564 real 0m1.581s 00:06:52.564 user 0m1.206s 00:06:52.564 sys 0m0.427s 00:06:52.564 21:31:54 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:52.564 21:31:54 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:52.564 ************************************ 00:06:52.564 END TEST json_config_extra_key 00:06:52.564 
************************************ 00:06:52.564 21:31:54 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:52.564 21:31:54 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:52.564 21:31:54 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:52.564 21:31:54 -- common/autotest_common.sh@10 -- # set +x 00:06:52.564 ************************************ 00:06:52.564 START TEST alias_rpc 00:06:52.564 ************************************ 00:06:52.564 21:31:54 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:52.823 * Looking for test storage... 00:06:52.823 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:52.823 21:31:54 alias_rpc -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:06:52.823 21:31:54 alias_rpc -- common/autotest_common.sh@1689 -- # lcov --version 00:06:52.823 21:31:54 alias_rpc -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:06:52.823 21:31:54 alias_rpc -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:06:52.823 21:31:54 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:52.823 21:31:54 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:52.823 21:31:54 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:52.823 21:31:54 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:52.823 21:31:54 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:52.823 21:31:54 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:52.823 21:31:54 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:52.823 21:31:54 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:52.823 21:31:54 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:52.823 21:31:54 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@345 -- # : 1 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:52.824 21:31:54 alias_rpc -- scripts/common.sh@368 -- # return 0 00:06:52.824 21:31:54 alias_rpc -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:52.824 21:31:54 alias_rpc -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:06:52.824 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:52.824 --rc genhtml_branch_coverage=1 00:06:52.824 --rc genhtml_function_coverage=1 00:06:52.824 --rc genhtml_legend=1 00:06:52.824 --rc geninfo_all_blocks=1 00:06:52.824 --rc geninfo_unexecuted_blocks=1 00:06:52.824 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:52.824 ' 00:06:52.824 21:31:54 alias_rpc -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:06:52.824 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:52.824 --rc genhtml_branch_coverage=1 00:06:52.824 --rc genhtml_function_coverage=1 00:06:52.824 --rc genhtml_legend=1 00:06:52.824 --rc geninfo_all_blocks=1 00:06:52.824 --rc geninfo_unexecuted_blocks=1 00:06:52.824 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:52.824 ' 00:06:52.824 21:31:54 alias_rpc -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:06:52.824 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:52.824 --rc genhtml_branch_coverage=1 00:06:52.824 --rc genhtml_function_coverage=1 00:06:52.824 --rc genhtml_legend=1 00:06:52.824 --rc geninfo_all_blocks=1 00:06:52.824 --rc geninfo_unexecuted_blocks=1 00:06:52.824 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:52.824 ' 00:06:52.824 21:31:54 alias_rpc -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:06:52.824 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:52.824 --rc genhtml_branch_coverage=1 00:06:52.824 --rc genhtml_function_coverage=1 00:06:52.824 --rc genhtml_legend=1 00:06:52.824 --rc geninfo_all_blocks=1 00:06:52.824 --rc geninfo_unexecuted_blocks=1 00:06:52.824 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:52.824 ' 00:06:52.824 21:31:54 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:52.824 21:31:54 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=3309724 00:06:52.824 21:31:54 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:52.824 21:31:54 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 3309724 00:06:52.824 21:31:54 alias_rpc -- 
common/autotest_common.sh@831 -- # '[' -z 3309724 ']' 00:06:52.824 21:31:54 alias_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:52.824 21:31:54 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:52.824 21:31:54 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:52.824 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:52.824 21:31:54 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:52.824 21:31:54 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:52.824 [2024-10-27 21:31:54.468812] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:06:52.824 [2024-10-27 21:31:54.468901] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3309724 ] 00:06:53.082 [2024-10-27 21:31:54.604911] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:53.082 [2024-10-27 21:31:54.640729] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.082 [2024-10-27 21:31:54.662633] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.648 21:31:55 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:53.648 21:31:55 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:06:53.648 21:31:55 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:53.905 21:31:55 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 3309724 00:06:53.905 21:31:55 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 3309724 ']' 00:06:53.905 21:31:55 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 3309724 00:06:53.905 21:31:55 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:06:53.905 21:31:55 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:53.905 21:31:55 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3309724 00:06:53.905 21:31:55 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:53.905 21:31:55 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:53.905 21:31:55 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3309724' 00:06:53.905 killing process with pid 3309724 00:06:53.906 21:31:55 alias_rpc -- common/autotest_common.sh@969 -- # kill 3309724 00:06:53.906 21:31:55 alias_rpc -- common/autotest_common.sh@974 -- # wait 3309724 00:06:54.164 00:06:54.164 real 0m1.633s 00:06:54.164 user 0m1.667s 00:06:54.164 sys 0m0.475s 00:06:54.164 21:31:55 alias_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:54.164 21:31:55 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:54.164 ************************************ 00:06:54.164 END TEST alias_rpc 00:06:54.164 ************************************ 00:06:54.422 21:31:55 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:06:54.422 21:31:55 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:54.422 21:31:55 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:54.422 21:31:55 -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:06:54.422 21:31:55 -- common/autotest_common.sh@10 -- # set +x 00:06:54.422 ************************************ 00:06:54.422 START TEST spdkcli_tcp 00:06:54.422 ************************************ 00:06:54.422 21:31:55 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:54.422 * Looking for test storage... 00:06:54.422 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:54.422 21:31:56 spdkcli_tcp -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:06:54.422 21:31:56 spdkcli_tcp -- common/autotest_common.sh@1689 -- # lcov --version 00:06:54.422 21:31:56 spdkcli_tcp -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:06:54.422 21:31:56 spdkcli_tcp -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:06:54.422 21:31:56 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:54.422 21:31:56 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:54.422 21:31:56 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:54.422 21:31:56 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:06:54.422 21:31:56 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:06:54.422 21:31:56 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:06:54.422 21:31:56 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:06:54.422 21:31:56 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:54.423 21:31:56 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:06:54.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.423 --rc genhtml_branch_coverage=1 00:06:54.423 --rc genhtml_function_coverage=1 00:06:54.423 --rc genhtml_legend=1 00:06:54.423 --rc geninfo_all_blocks=1 00:06:54.423 --rc geninfo_unexecuted_blocks=1 00:06:54.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.423 ' 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:06:54.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.423 --rc genhtml_branch_coverage=1 00:06:54.423 --rc genhtml_function_coverage=1 00:06:54.423 --rc genhtml_legend=1 00:06:54.423 --rc geninfo_all_blocks=1 00:06:54.423 --rc geninfo_unexecuted_blocks=1 00:06:54.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.423 ' 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:06:54.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.423 --rc genhtml_branch_coverage=1 00:06:54.423 --rc genhtml_function_coverage=1 00:06:54.423 --rc genhtml_legend=1 00:06:54.423 --rc geninfo_all_blocks=1 00:06:54.423 --rc geninfo_unexecuted_blocks=1 00:06:54.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.423 ' 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:06:54.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.423 --rc genhtml_branch_coverage=1 00:06:54.423 --rc genhtml_function_coverage=1 00:06:54.423 --rc genhtml_legend=1 00:06:54.423 --rc geninfo_all_blocks=1 00:06:54.423 --rc geninfo_unexecuted_blocks=1 00:06:54.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.423 ' 00:06:54.423 21:31:56 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:54.423 21:31:56 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:54.423 21:31:56 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:54.423 21:31:56 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:54.423 21:31:56 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:54.423 21:31:56 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:54.423 21:31:56 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:54.423 21:31:56 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=3310079 00:06:54.423 21:31:56 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:54.423 21:31:56 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 3310079 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 3310079 ']' 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:54.423 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:54.423 21:31:56 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:54.682 [2024-10-27 21:31:56.158899] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:06:54.682 [2024-10-27 21:31:56.158969] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3310079 ] 00:06:54.682 [2024-10-27 21:31:56.294353] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
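The tcp.sh setup above (IP_ADDRESS=127.0.0.1, PORT=9998, spdk_tgt listening on /var/tmp/spdk.sock) is exercised just below by bridging the UNIX-domain RPC socket to TCP. A minimal standalone sketch of that bridge, using only the addresses and flags visible in this trace (cleanup simplified):

    # Publish the target's UNIX-domain RPC socket on 127.0.0.1:9998, then
    # issue an RPC over the TCP side with rpc.py's -s/-p options.
    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
    socat_pid=$!
    ./scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
    kill "$socat_pid"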
00:06:54.682 [2024-10-27 21:31:56.329016] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:54.682 [2024-10-27 21:31:56.352342] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:54.682 [2024-10-27 21:31:56.352346] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.619 21:31:57 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:55.619 21:31:57 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:06:55.619 21:31:57 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=3310096 00:06:55.620 21:31:57 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:55.620 21:31:57 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:55.620 [ 00:06:55.620 "spdk_get_version", 00:06:55.620 "rpc_get_methods", 00:06:55.620 "notify_get_notifications", 00:06:55.620 "notify_get_types", 00:06:55.620 "trace_get_info", 00:06:55.620 "trace_get_tpoint_group_mask", 00:06:55.620 "trace_disable_tpoint_group", 00:06:55.620 "trace_enable_tpoint_group", 00:06:55.620 "trace_clear_tpoint_mask", 00:06:55.620 "trace_set_tpoint_mask", 00:06:55.620 "fsdev_set_opts", 00:06:55.620 "fsdev_get_opts", 00:06:55.620 "framework_get_pci_devices", 00:06:55.620 "framework_get_config", 00:06:55.620 "framework_get_subsystems", 00:06:55.620 "vfu_tgt_set_base_path", 00:06:55.620 "keyring_get_keys", 00:06:55.620 "iobuf_get_stats", 00:06:55.620 "iobuf_set_options", 00:06:55.620 "sock_get_default_impl", 00:06:55.620 "sock_set_default_impl", 00:06:55.620 "sock_impl_set_options", 00:06:55.620 "sock_impl_get_options", 00:06:55.620 "vmd_rescan", 00:06:55.620 "vmd_remove_device", 00:06:55.620 "vmd_enable", 00:06:55.620 "accel_get_stats", 00:06:55.620 "accel_set_options", 00:06:55.620 "accel_set_driver", 00:06:55.620 "accel_crypto_key_destroy", 00:06:55.620 "accel_crypto_keys_get", 00:06:55.620 "accel_crypto_key_create", 00:06:55.620 "accel_assign_opc", 00:06:55.620 "accel_get_module_info", 00:06:55.620 "accel_get_opc_assignments", 00:06:55.620 "bdev_get_histogram", 00:06:55.620 "bdev_enable_histogram", 00:06:55.620 "bdev_set_qos_limit", 00:06:55.620 "bdev_set_qd_sampling_period", 00:06:55.620 "bdev_get_bdevs", 00:06:55.620 "bdev_reset_iostat", 00:06:55.620 "bdev_get_iostat", 00:06:55.620 "bdev_examine", 00:06:55.620 "bdev_wait_for_examine", 00:06:55.620 "bdev_set_options", 00:06:55.620 "scsi_get_devices", 00:06:55.620 "thread_set_cpumask", 00:06:55.620 "scheduler_set_options", 00:06:55.620 "framework_get_governor", 00:06:55.620 "framework_get_scheduler", 00:06:55.620 "framework_set_scheduler", 00:06:55.620 "framework_get_reactors", 00:06:55.620 "thread_get_io_channels", 00:06:55.620 "thread_get_pollers", 00:06:55.620 "thread_get_stats", 00:06:55.620 "framework_monitor_context_switch", 00:06:55.620 "spdk_kill_instance", 00:06:55.620 "log_enable_timestamps", 00:06:55.620 "log_get_flags", 00:06:55.620 "log_clear_flag", 00:06:55.620 "log_set_flag", 00:06:55.620 "log_get_level", 00:06:55.620 "log_set_level", 00:06:55.620 "log_get_print_level", 00:06:55.620 "log_set_print_level", 00:06:55.620 "framework_enable_cpumask_locks", 00:06:55.620 "framework_disable_cpumask_locks", 00:06:55.620 "framework_wait_init", 00:06:55.620 "framework_start_init", 00:06:55.620 "virtio_blk_create_transport", 00:06:55.620 "virtio_blk_get_transports", 00:06:55.620 "vhost_controller_set_coalescing", 00:06:55.620 "vhost_get_controllers", 00:06:55.620 
"vhost_delete_controller", 00:06:55.620 "vhost_create_blk_controller", 00:06:55.620 "vhost_scsi_controller_remove_target", 00:06:55.620 "vhost_scsi_controller_add_target", 00:06:55.620 "vhost_start_scsi_controller", 00:06:55.620 "vhost_create_scsi_controller", 00:06:55.620 "ublk_recover_disk", 00:06:55.620 "ublk_get_disks", 00:06:55.620 "ublk_stop_disk", 00:06:55.620 "ublk_start_disk", 00:06:55.620 "ublk_destroy_target", 00:06:55.620 "ublk_create_target", 00:06:55.620 "nbd_get_disks", 00:06:55.620 "nbd_stop_disk", 00:06:55.620 "nbd_start_disk", 00:06:55.620 "env_dpdk_get_mem_stats", 00:06:55.620 "nvmf_stop_mdns_prr", 00:06:55.620 "nvmf_publish_mdns_prr", 00:06:55.620 "nvmf_subsystem_get_listeners", 00:06:55.620 "nvmf_subsystem_get_qpairs", 00:06:55.620 "nvmf_subsystem_get_controllers", 00:06:55.620 "nvmf_get_stats", 00:06:55.620 "nvmf_get_transports", 00:06:55.620 "nvmf_create_transport", 00:06:55.620 "nvmf_get_targets", 00:06:55.620 "nvmf_delete_target", 00:06:55.620 "nvmf_create_target", 00:06:55.620 "nvmf_subsystem_allow_any_host", 00:06:55.620 "nvmf_subsystem_set_keys", 00:06:55.620 "nvmf_subsystem_remove_host", 00:06:55.620 "nvmf_subsystem_add_host", 00:06:55.620 "nvmf_ns_remove_host", 00:06:55.620 "nvmf_ns_add_host", 00:06:55.620 "nvmf_subsystem_remove_ns", 00:06:55.620 "nvmf_subsystem_set_ns_ana_group", 00:06:55.620 "nvmf_subsystem_add_ns", 00:06:55.620 "nvmf_subsystem_listener_set_ana_state", 00:06:55.620 "nvmf_discovery_get_referrals", 00:06:55.620 "nvmf_discovery_remove_referral", 00:06:55.620 "nvmf_discovery_add_referral", 00:06:55.620 "nvmf_subsystem_remove_listener", 00:06:55.620 "nvmf_subsystem_add_listener", 00:06:55.620 "nvmf_delete_subsystem", 00:06:55.620 "nvmf_create_subsystem", 00:06:55.620 "nvmf_get_subsystems", 00:06:55.620 "nvmf_set_crdt", 00:06:55.620 "nvmf_set_config", 00:06:55.620 "nvmf_set_max_subsystems", 00:06:55.620 "iscsi_get_histogram", 00:06:55.620 "iscsi_enable_histogram", 00:06:55.620 "iscsi_set_options", 00:06:55.620 "iscsi_get_auth_groups", 00:06:55.620 "iscsi_auth_group_remove_secret", 00:06:55.620 "iscsi_auth_group_add_secret", 00:06:55.620 "iscsi_delete_auth_group", 00:06:55.620 "iscsi_create_auth_group", 00:06:55.620 "iscsi_set_discovery_auth", 00:06:55.620 "iscsi_get_options", 00:06:55.620 "iscsi_target_node_request_logout", 00:06:55.620 "iscsi_target_node_set_redirect", 00:06:55.620 "iscsi_target_node_set_auth", 00:06:55.620 "iscsi_target_node_add_lun", 00:06:55.620 "iscsi_get_stats", 00:06:55.620 "iscsi_get_connections", 00:06:55.620 "iscsi_portal_group_set_auth", 00:06:55.620 "iscsi_start_portal_group", 00:06:55.620 "iscsi_delete_portal_group", 00:06:55.620 "iscsi_create_portal_group", 00:06:55.620 "iscsi_get_portal_groups", 00:06:55.620 "iscsi_delete_target_node", 00:06:55.620 "iscsi_target_node_remove_pg_ig_maps", 00:06:55.620 "iscsi_target_node_add_pg_ig_maps", 00:06:55.620 "iscsi_create_target_node", 00:06:55.620 "iscsi_get_target_nodes", 00:06:55.620 "iscsi_delete_initiator_group", 00:06:55.620 "iscsi_initiator_group_remove_initiators", 00:06:55.620 "iscsi_initiator_group_add_initiators", 00:06:55.620 "iscsi_create_initiator_group", 00:06:55.620 "iscsi_get_initiator_groups", 00:06:55.620 "fsdev_aio_delete", 00:06:55.620 "fsdev_aio_create", 00:06:55.620 "keyring_linux_set_options", 00:06:55.620 "keyring_file_remove_key", 00:06:55.620 "keyring_file_add_key", 00:06:55.620 "vfu_virtio_create_fs_endpoint", 00:06:55.620 "vfu_virtio_create_scsi_endpoint", 00:06:55.620 "vfu_virtio_scsi_remove_target", 00:06:55.620 "vfu_virtio_scsi_add_target", 
00:06:55.620 "vfu_virtio_create_blk_endpoint", 00:06:55.620 "vfu_virtio_delete_endpoint", 00:06:55.620 "iaa_scan_accel_module", 00:06:55.620 "dsa_scan_accel_module", 00:06:55.620 "ioat_scan_accel_module", 00:06:55.620 "accel_error_inject_error", 00:06:55.620 "bdev_iscsi_delete", 00:06:55.620 "bdev_iscsi_create", 00:06:55.620 "bdev_iscsi_set_options", 00:06:55.620 "bdev_virtio_attach_controller", 00:06:55.620 "bdev_virtio_scsi_get_devices", 00:06:55.620 "bdev_virtio_detach_controller", 00:06:55.620 "bdev_virtio_blk_set_hotplug", 00:06:55.620 "bdev_ftl_set_property", 00:06:55.620 "bdev_ftl_get_properties", 00:06:55.620 "bdev_ftl_get_stats", 00:06:55.620 "bdev_ftl_unmap", 00:06:55.620 "bdev_ftl_unload", 00:06:55.620 "bdev_ftl_delete", 00:06:55.620 "bdev_ftl_load", 00:06:55.620 "bdev_ftl_create", 00:06:55.620 "bdev_aio_delete", 00:06:55.620 "bdev_aio_rescan", 00:06:55.620 "bdev_aio_create", 00:06:55.620 "blobfs_create", 00:06:55.620 "blobfs_detect", 00:06:55.620 "blobfs_set_cache_size", 00:06:55.620 "bdev_zone_block_delete", 00:06:55.620 "bdev_zone_block_create", 00:06:55.620 "bdev_delay_delete", 00:06:55.620 "bdev_delay_create", 00:06:55.620 "bdev_delay_update_latency", 00:06:55.620 "bdev_split_delete", 00:06:55.620 "bdev_split_create", 00:06:55.620 "bdev_error_inject_error", 00:06:55.620 "bdev_error_delete", 00:06:55.620 "bdev_error_create", 00:06:55.620 "bdev_raid_set_options", 00:06:55.620 "bdev_raid_remove_base_bdev", 00:06:55.620 "bdev_raid_add_base_bdev", 00:06:55.620 "bdev_raid_delete", 00:06:55.620 "bdev_raid_create", 00:06:55.620 "bdev_raid_get_bdevs", 00:06:55.620 "bdev_lvol_set_parent_bdev", 00:06:55.620 "bdev_lvol_set_parent", 00:06:55.620 "bdev_lvol_check_shallow_copy", 00:06:55.620 "bdev_lvol_start_shallow_copy", 00:06:55.620 "bdev_lvol_grow_lvstore", 00:06:55.620 "bdev_lvol_get_lvols", 00:06:55.620 "bdev_lvol_get_lvstores", 00:06:55.620 "bdev_lvol_delete", 00:06:55.620 "bdev_lvol_set_read_only", 00:06:55.620 "bdev_lvol_resize", 00:06:55.620 "bdev_lvol_decouple_parent", 00:06:55.620 "bdev_lvol_inflate", 00:06:55.620 "bdev_lvol_rename", 00:06:55.620 "bdev_lvol_clone_bdev", 00:06:55.620 "bdev_lvol_clone", 00:06:55.620 "bdev_lvol_snapshot", 00:06:55.620 "bdev_lvol_create", 00:06:55.620 "bdev_lvol_delete_lvstore", 00:06:55.620 "bdev_lvol_rename_lvstore", 00:06:55.620 "bdev_lvol_create_lvstore", 00:06:55.620 "bdev_passthru_delete", 00:06:55.620 "bdev_passthru_create", 00:06:55.620 "bdev_nvme_cuse_unregister", 00:06:55.620 "bdev_nvme_cuse_register", 00:06:55.620 "bdev_opal_new_user", 00:06:55.620 "bdev_opal_set_lock_state", 00:06:55.620 "bdev_opal_delete", 00:06:55.620 "bdev_opal_get_info", 00:06:55.620 "bdev_opal_create", 00:06:55.620 "bdev_nvme_opal_revert", 00:06:55.620 "bdev_nvme_opal_init", 00:06:55.620 "bdev_nvme_send_cmd", 00:06:55.620 "bdev_nvme_set_keys", 00:06:55.620 "bdev_nvme_get_path_iostat", 00:06:55.620 "bdev_nvme_get_mdns_discovery_info", 00:06:55.620 "bdev_nvme_stop_mdns_discovery", 00:06:55.621 "bdev_nvme_start_mdns_discovery", 00:06:55.621 "bdev_nvme_set_multipath_policy", 00:06:55.621 "bdev_nvme_set_preferred_path", 00:06:55.621 "bdev_nvme_get_io_paths", 00:06:55.621 "bdev_nvme_remove_error_injection", 00:06:55.621 "bdev_nvme_add_error_injection", 00:06:55.621 "bdev_nvme_get_discovery_info", 00:06:55.621 "bdev_nvme_stop_discovery", 00:06:55.621 "bdev_nvme_start_discovery", 00:06:55.621 "bdev_nvme_get_controller_health_info", 00:06:55.621 "bdev_nvme_disable_controller", 00:06:55.621 "bdev_nvme_enable_controller", 00:06:55.621 "bdev_nvme_reset_controller", 
00:06:55.621 "bdev_nvme_get_transport_statistics", 00:06:55.621 "bdev_nvme_apply_firmware", 00:06:55.621 "bdev_nvme_detach_controller", 00:06:55.621 "bdev_nvme_get_controllers", 00:06:55.621 "bdev_nvme_attach_controller", 00:06:55.621 "bdev_nvme_set_hotplug", 00:06:55.621 "bdev_nvme_set_options", 00:06:55.621 "bdev_null_resize", 00:06:55.621 "bdev_null_delete", 00:06:55.621 "bdev_null_create", 00:06:55.621 "bdev_malloc_delete", 00:06:55.621 "bdev_malloc_create" 00:06:55.621 ] 00:06:55.621 21:31:57 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:55.621 21:31:57 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:55.621 21:31:57 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 3310079 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@950 -- # '[' -z 3310079 ']' 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 3310079 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3310079 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3310079' 00:06:55.621 killing process with pid 3310079 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 3310079 00:06:55.621 21:31:57 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 3310079 00:06:55.880 00:06:55.880 real 0m1.625s 00:06:55.880 user 0m2.836s 00:06:55.880 sys 0m0.528s 00:06:55.880 21:31:57 spdkcli_tcp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:55.880 21:31:57 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:55.880 ************************************ 00:06:55.880 END TEST spdkcli_tcp 00:06:55.880 ************************************ 00:06:56.140 21:31:57 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:56.140 21:31:57 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:56.140 21:31:57 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:56.140 21:31:57 -- common/autotest_common.sh@10 -- # set +x 00:06:56.140 ************************************ 00:06:56.140 START TEST dpdk_mem_utility 00:06:56.140 ************************************ 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:56.140 * Looking for test storage... 
00:06:56.140 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@1689 -- # lcov --version 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:56.140 21:31:57 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:06:56.140 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.140 --rc genhtml_branch_coverage=1 00:06:56.140 --rc genhtml_function_coverage=1 00:06:56.140 --rc genhtml_legend=1 00:06:56.140 --rc geninfo_all_blocks=1 00:06:56.140 --rc geninfo_unexecuted_blocks=1 00:06:56.140 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.140 ' 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@1702 -- # 
LCOV_OPTS=' 00:06:56.140 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.140 --rc genhtml_branch_coverage=1 00:06:56.140 --rc genhtml_function_coverage=1 00:06:56.140 --rc genhtml_legend=1 00:06:56.140 --rc geninfo_all_blocks=1 00:06:56.140 --rc geninfo_unexecuted_blocks=1 00:06:56.140 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.140 ' 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:06:56.140 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.140 --rc genhtml_branch_coverage=1 00:06:56.140 --rc genhtml_function_coverage=1 00:06:56.140 --rc genhtml_legend=1 00:06:56.140 --rc geninfo_all_blocks=1 00:06:56.140 --rc geninfo_unexecuted_blocks=1 00:06:56.140 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.140 ' 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:06:56.140 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:56.140 --rc genhtml_branch_coverage=1 00:06:56.140 --rc genhtml_function_coverage=1 00:06:56.140 --rc genhtml_legend=1 00:06:56.140 --rc geninfo_all_blocks=1 00:06:56.140 --rc geninfo_unexecuted_blocks=1 00:06:56.140 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:56.140 ' 00:06:56.140 21:31:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:56.140 21:31:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=3310423 00:06:56.140 21:31:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 3310423 00:06:56.140 21:31:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 3310423 ']' 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:56.140 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:56.140 21:31:57 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:56.400 [2024-10-27 21:31:57.878358] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:06:56.400 [2024-10-27 21:31:57.878448] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3310423 ] 00:06:56.400 [2024-10-27 21:31:58.030572] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
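The dpdk_mem_utility test that follows drives two tools: the env_dpdk_get_mem_stats RPC, which makes the running target write its DPDK memory state to a dump file, and scripts/dpdk_mem_info.py, which summarizes that dump. A minimal sketch of the sequence (paths as in the trace; error handling omitted):

    # Ask the target to dump its DPDK memory state, then summarize it.
    ./scripts/rpc.py env_dpdk_get_mem_stats   # target writes /tmp/spdk_mem_dump.txt
    ./scripts/dpdk_mem_info.py                # heaps, mempools, memzones
    ./scripts/dpdk_mem_info.py -m 0           # element-level view of heap 0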
00:06:56.400 [2024-10-27 21:31:58.064898] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.400 [2024-10-27 21:31:58.086292] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.339 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:57.339 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:06:57.339 21:31:58 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:57.339 21:31:58 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:57.339 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:57.339 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:57.339 { 00:06:57.339 "filename": "/tmp/spdk_mem_dump.txt" 00:06:57.339 } 00:06:57.339 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:57.339 21:31:58 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:57.339 DPDK memory size 810.000000 MiB in 1 heap(s) 00:06:57.339 1 heaps totaling size 810.000000 MiB 00:06:57.339 size: 810.000000 MiB heap id: 0 00:06:57.339 end heaps---------- 00:06:57.339 9 mempools totaling size 595.772034 MiB 00:06:57.339 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:57.339 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:57.339 size: 92.545471 MiB name: bdev_io_3310423 00:06:57.339 size: 50.003479 MiB name: msgpool_3310423 00:06:57.339 size: 36.509338 MiB name: fsdev_io_3310423 00:06:57.339 size: 21.763794 MiB name: PDU_Pool 00:06:57.339 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:57.339 size: 4.133484 MiB name: evtpool_3310423 00:06:57.339 size: 0.026123 MiB name: Session_Pool 00:06:57.339 end mempools------- 00:06:57.339 6 memzones totaling size 4.142822 MiB 00:06:57.339 size: 1.000366 MiB name: RG_ring_0_3310423 00:06:57.339 size: 1.000366 MiB name: RG_ring_1_3310423 00:06:57.339 size: 1.000366 MiB name: RG_ring_4_3310423 00:06:57.339 size: 1.000366 MiB name: RG_ring_5_3310423 00:06:57.339 size: 0.125366 MiB name: RG_ring_2_3310423 00:06:57.339 size: 0.015991 MiB name: RG_ring_3_3310423 00:06:57.339 end memzones------- 00:06:57.339 21:31:58 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:57.339 heap id: 0 total size: 810.000000 MiB number of busy elements: 44 number of free elements: 15 00:06:57.339 list of free elements. 
size: 10.745300 MiB 00:06:57.339 element at address: 0x200018a00000 with size: 0.999878 MiB 00:06:57.339 element at address: 0x200018c00000 with size: 0.999878 MiB 00:06:57.339 element at address: 0x200000400000 with size: 0.998535 MiB 00:06:57.339 element at address: 0x200031800000 with size: 0.994446 MiB 00:06:57.340 element at address: 0x200008000000 with size: 0.959839 MiB 00:06:57.340 element at address: 0x200012c00000 with size: 0.954285 MiB 00:06:57.340 element at address: 0x200018e00000 with size: 0.936584 MiB 00:06:57.340 element at address: 0x200000200000 with size: 0.600159 MiB 00:06:57.340 element at address: 0x20001a600000 with size: 0.582886 MiB 00:06:57.340 element at address: 0x200000c00000 with size: 0.495422 MiB 00:06:57.340 element at address: 0x200003e00000 with size: 0.490723 MiB 00:06:57.340 element at address: 0x200019000000 with size: 0.485657 MiB 00:06:57.340 element at address: 0x200010600000 with size: 0.481934 MiB 00:06:57.340 element at address: 0x200027a00000 with size: 0.410034 MiB 00:06:57.340 element at address: 0x200000800000 with size: 0.355042 MiB 00:06:57.340 list of standard malloc elements. size: 199.335815 MiB 00:06:57.340 element at address: 0x2000081fff80 with size: 132.000122 MiB 00:06:57.340 element at address: 0x200003ffff80 with size: 64.000122 MiB 00:06:57.340 element at address: 0x200018afff80 with size: 1.000122 MiB 00:06:57.340 element at address: 0x200018cfff80 with size: 1.000122 MiB 00:06:57.340 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:57.340 element at address: 0x2000003bbf00 with size: 0.257935 MiB 00:06:57.340 element at address: 0x200018eeff00 with size: 0.062622 MiB 00:06:57.340 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:57.340 element at address: 0x200018eefdc0 with size: 0.000305 MiB 00:06:57.340 element at address: 0x2000002b9c40 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000003bbe40 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000004ffa00 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000004ffac0 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000004ffb80 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000004ffd80 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000004ffe40 with size: 0.000183 MiB 00:06:57.340 element at address: 0x20000085ae40 with size: 0.000183 MiB 00:06:57.340 element at address: 0x20000085b040 with size: 0.000183 MiB 00:06:57.340 element at address: 0x20000085b100 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000008db3c0 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000008db5c0 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000008df880 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000008ffb40 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200000c7ed40 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200000cff000 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200000cff0c0 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200003e7da00 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200003e7dac0 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200003efdd80 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000080fdd80 with size: 0.000183 MiB 00:06:57.340 element at address: 0x20001067b600 with size: 0.000183 MiB 00:06:57.340 element at address: 0x20001067b6c0 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000106fb980 with size: 0.000183 MiB 
00:06:57.340 element at address: 0x200012cf44c0 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200018eefc40 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200018eefd00 with size: 0.000183 MiB 00:06:57.340 element at address: 0x2000190bc740 with size: 0.000183 MiB 00:06:57.340 element at address: 0x20001a695380 with size: 0.000183 MiB 00:06:57.340 element at address: 0x20001a695440 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200027a68f80 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200027a69040 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200027a6fc40 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200027a6fe40 with size: 0.000183 MiB 00:06:57.340 element at address: 0x200027a6ff00 with size: 0.000183 MiB 00:06:57.340 list of memzone associated elements. size: 599.918884 MiB 00:06:57.340 element at address: 0x20001a695500 with size: 211.416748 MiB 00:06:57.340 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:57.340 element at address: 0x200027a6ffc0 with size: 157.562561 MiB 00:06:57.340 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:57.340 element at address: 0x200012df4780 with size: 92.045044 MiB 00:06:57.340 associated memzone info: size: 92.044922 MiB name: MP_bdev_io_3310423_0 00:06:57.340 element at address: 0x200000dff380 with size: 48.003052 MiB 00:06:57.340 associated memzone info: size: 48.002930 MiB name: MP_msgpool_3310423_0 00:06:57.340 element at address: 0x2000107fdb80 with size: 36.008911 MiB 00:06:57.340 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_3310423_0 00:06:57.340 element at address: 0x2000191be940 with size: 20.255554 MiB 00:06:57.340 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:57.340 element at address: 0x2000319feb40 with size: 18.005066 MiB 00:06:57.340 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:57.340 element at address: 0x2000004fff00 with size: 3.000244 MiB 00:06:57.340 associated memzone info: size: 3.000122 MiB name: MP_evtpool_3310423_0 00:06:57.340 element at address: 0x2000009ffe00 with size: 2.000488 MiB 00:06:57.340 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_3310423 00:06:57.340 element at address: 0x2000002b9d00 with size: 1.008118 MiB 00:06:57.340 associated memzone info: size: 1.007996 MiB name: MP_evtpool_3310423 00:06:57.340 element at address: 0x2000106fba40 with size: 1.008118 MiB 00:06:57.340 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:57.340 element at address: 0x2000190bc800 with size: 1.008118 MiB 00:06:57.340 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:57.340 element at address: 0x2000080fde40 with size: 1.008118 MiB 00:06:57.340 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:57.340 element at address: 0x200003efde40 with size: 1.008118 MiB 00:06:57.340 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:57.340 element at address: 0x200000cff180 with size: 1.000488 MiB 00:06:57.340 associated memzone info: size: 1.000366 MiB name: RG_ring_0_3310423 00:06:57.340 element at address: 0x2000008ffc00 with size: 1.000488 MiB 00:06:57.340 associated memzone info: size: 1.000366 MiB name: RG_ring_1_3310423 00:06:57.340 element at address: 0x200012cf4580 with size: 1.000488 MiB 00:06:57.340 associated memzone info: size: 1.000366 MiB name: RG_ring_4_3310423 00:06:57.340 element at address: 
0x2000318fe940 with size: 1.000488 MiB 00:06:57.340 associated memzone info: size: 1.000366 MiB name: RG_ring_5_3310423 00:06:57.340 element at address: 0x20000085b1c0 with size: 0.500488 MiB 00:06:57.340 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_3310423 00:06:57.340 element at address: 0x200000c7ee00 with size: 0.500488 MiB 00:06:57.340 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_3310423 00:06:57.340 element at address: 0x20001067b780 with size: 0.500488 MiB 00:06:57.340 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:57.340 element at address: 0x200003e7db80 with size: 0.500488 MiB 00:06:57.340 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:57.340 element at address: 0x20001907c540 with size: 0.250488 MiB 00:06:57.340 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:57.340 element at address: 0x200000299a40 with size: 0.125488 MiB 00:06:57.340 associated memzone info: size: 0.125366 MiB name: RG_MP_evtpool_3310423 00:06:57.340 element at address: 0x2000008df940 with size: 0.125488 MiB 00:06:57.340 associated memzone info: size: 0.125366 MiB name: RG_ring_2_3310423 00:06:57.340 element at address: 0x2000080f5b80 with size: 0.031738 MiB 00:06:57.340 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:57.340 element at address: 0x200027a69100 with size: 0.023743 MiB 00:06:57.340 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:57.340 element at address: 0x2000008db680 with size: 0.016113 MiB 00:06:57.340 associated memzone info: size: 0.015991 MiB name: RG_ring_3_3310423 00:06:57.340 element at address: 0x200027a6f240 with size: 0.002441 MiB 00:06:57.340 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:57.340 element at address: 0x2000004ffc40 with size: 0.000305 MiB 00:06:57.340 associated memzone info: size: 0.000183 MiB name: MP_msgpool_3310423 00:06:57.340 element at address: 0x2000008db480 with size: 0.000305 MiB 00:06:57.340 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_3310423 00:06:57.340 element at address: 0x20000085af00 with size: 0.000305 MiB 00:06:57.340 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_3310423 00:06:57.340 element at address: 0x200027a6fd00 with size: 0.000305 MiB 00:06:57.340 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:57.340 21:31:58 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:57.340 21:31:58 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 3310423 00:06:57.340 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 3310423 ']' 00:06:57.340 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 3310423 00:06:57.340 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@955 -- # uname 00:06:57.340 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:57.340 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3310423 00:06:57.340 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:57.340 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:57.340 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3310423' 00:06:57.340 killing process with pid 3310423 00:06:57.340 21:31:58 
dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 3310423 00:06:57.340 21:31:58 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 3310423 00:06:57.600 00:06:57.600 real 0m1.501s 00:06:57.600 user 0m1.433s 00:06:57.600 sys 0m0.474s 00:06:57.600 21:31:59 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:57.600 21:31:59 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:57.600 ************************************ 00:06:57.600 END TEST dpdk_mem_utility 00:06:57.600 ************************************ 00:06:57.600 21:31:59 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:57.600 21:31:59 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:57.600 21:31:59 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:57.600 21:31:59 -- common/autotest_common.sh@10 -- # set +x 00:06:57.600 ************************************ 00:06:57.600 START TEST event 00:06:57.600 ************************************ 00:06:57.600 21:31:59 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:57.860 * Looking for test storage... 00:06:57.860 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:57.860 21:31:59 event -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:06:57.860 21:31:59 event -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:06:57.860 21:31:59 event -- common/autotest_common.sh@1689 -- # lcov --version 00:06:57.860 21:31:59 event -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:06:57.860 21:31:59 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:57.860 21:31:59 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:57.860 21:31:59 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:57.860 21:31:59 event -- scripts/common.sh@336 -- # IFS=.-: 00:06:57.860 21:31:59 event -- scripts/common.sh@336 -- # read -ra ver1 00:06:57.860 21:31:59 event -- scripts/common.sh@337 -- # IFS=.-: 00:06:57.860 21:31:59 event -- scripts/common.sh@337 -- # read -ra ver2 00:06:57.860 21:31:59 event -- scripts/common.sh@338 -- # local 'op=<' 00:06:57.860 21:31:59 event -- scripts/common.sh@340 -- # ver1_l=2 00:06:57.860 21:31:59 event -- scripts/common.sh@341 -- # ver2_l=1 00:06:57.860 21:31:59 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:57.860 21:31:59 event -- scripts/common.sh@344 -- # case "$op" in 00:06:57.860 21:31:59 event -- scripts/common.sh@345 -- # : 1 00:06:57.860 21:31:59 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:57.860 21:31:59 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:57.860 21:31:59 event -- scripts/common.sh@365 -- # decimal 1 00:06:57.860 21:31:59 event -- scripts/common.sh@353 -- # local d=1 00:06:57.860 21:31:59 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:57.860 21:31:59 event -- scripts/common.sh@355 -- # echo 1 00:06:57.860 21:31:59 event -- scripts/common.sh@365 -- # ver1[v]=1 00:06:57.860 21:31:59 event -- scripts/common.sh@366 -- # decimal 2 00:06:57.860 21:31:59 event -- scripts/common.sh@353 -- # local d=2 00:06:57.860 21:31:59 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:57.860 21:31:59 event -- scripts/common.sh@355 -- # echo 2 00:06:57.861 21:31:59 event -- scripts/common.sh@366 -- # ver2[v]=2 00:06:57.861 21:31:59 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:57.861 21:31:59 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:57.861 21:31:59 event -- scripts/common.sh@368 -- # return 0 00:06:57.861 21:31:59 event -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:57.861 21:31:59 event -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:06:57.861 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:57.861 --rc genhtml_branch_coverage=1 00:06:57.861 --rc genhtml_function_coverage=1 00:06:57.861 --rc genhtml_legend=1 00:06:57.861 --rc geninfo_all_blocks=1 00:06:57.861 --rc geninfo_unexecuted_blocks=1 00:06:57.861 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:57.861 ' 00:06:57.861 21:31:59 event -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:06:57.861 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:57.861 --rc genhtml_branch_coverage=1 00:06:57.861 --rc genhtml_function_coverage=1 00:06:57.861 --rc genhtml_legend=1 00:06:57.861 --rc geninfo_all_blocks=1 00:06:57.861 --rc geninfo_unexecuted_blocks=1 00:06:57.861 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:57.861 ' 00:06:57.861 21:31:59 event -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:06:57.861 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:57.861 --rc genhtml_branch_coverage=1 00:06:57.861 --rc genhtml_function_coverage=1 00:06:57.861 --rc genhtml_legend=1 00:06:57.861 --rc geninfo_all_blocks=1 00:06:57.861 --rc geninfo_unexecuted_blocks=1 00:06:57.861 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:57.861 ' 00:06:57.861 21:31:59 event -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:06:57.861 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:57.861 --rc genhtml_branch_coverage=1 00:06:57.861 --rc genhtml_function_coverage=1 00:06:57.861 --rc genhtml_legend=1 00:06:57.861 --rc geninfo_all_blocks=1 00:06:57.861 --rc geninfo_unexecuted_blocks=1 00:06:57.861 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:57.861 ' 00:06:57.861 21:31:59 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:57.861 21:31:59 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:57.861 21:31:59 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:57.861 21:31:59 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:06:57.861 21:31:59 event -- common/autotest_common.sh@1107 -- # xtrace_disable 
00:06:57.861 21:31:59 event -- common/autotest_common.sh@10 -- # set +x 00:06:57.861 ************************************ 00:06:57.861 START TEST event_perf 00:06:57.861 ************************************ 00:06:57.861 21:31:59 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:57.861 Running I/O for 1 seconds...[2024-10-27 21:31:59.499496] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:06:57.861 [2024-10-27 21:31:59.499577] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3310767 ] 00:06:58.120 [2024-10-27 21:31:59.640558] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:58.120 [2024-10-27 21:31:59.674487] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:58.120 [2024-10-27 21:31:59.701253] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:58.120 [2024-10-27 21:31:59.701351] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:06:58.120 [2024-10-27 21:31:59.701439] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:06:58.120 [2024-10-27 21:31:59.701441] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.057 Running I/O for 1 seconds... 00:06:59.057 lcore 0: 197342 00:06:59.057 lcore 1: 197342 00:06:59.057 lcore 2: 197345 00:06:59.057 lcore 3: 197343 00:06:59.057 done. 00:06:59.057 00:06:59.057 real 0m1.256s 00:06:59.057 user 0m4.055s 00:06:59.057 sys 0m0.091s 00:06:59.057 21:32:00 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:59.057 21:32:00 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:59.057 ************************************ 00:06:59.057 END TEST event_perf 00:06:59.057 ************************************ 00:06:59.057 21:32:00 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:59.057 21:32:00 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:06:59.057 21:32:00 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:59.057 21:32:00 event -- common/autotest_common.sh@10 -- # set +x 00:06:59.315 ************************************ 00:06:59.315 START TEST event_reactor 00:06:59.315 ************************************ 00:06:59.315 21:32:00 event.event_reactor -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:59.315 [2024-10-27 21:32:00.839138] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:06:59.315 [2024-10-27 21:32:00.839223] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3311050 ] 00:06:59.315 [2024-10-27 21:32:00.979069] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
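The event binaries above are standalone and take just a core mask and a duration; each "lcore N:" line in the event_perf output is that core's event count for the one-second run. Invoked directly outside the harness, the same runs look like this (paths relative to the spdk tree, arguments as in the trace):

    # Raw event-framework benchmarks, bypassing run_test.
    ./test/event/event_perf/event_perf -m 0xF -t 1   # per-lcore event counts
    ./test/event/reactor/reactor -t 1                # oneshot/tick schedule trace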
00:06:59.315 [2024-10-27 21:32:01.013428] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.315 [2024-10-27 21:32:01.036059] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.698 test_start 00:07:00.698 oneshot 00:07:00.698 tick 100 00:07:00.698 tick 100 00:07:00.698 tick 250 00:07:00.698 tick 100 00:07:00.698 tick 100 00:07:00.698 tick 100 00:07:00.698 tick 250 00:07:00.698 tick 500 00:07:00.698 tick 100 00:07:00.698 tick 100 00:07:00.698 tick 250 00:07:00.698 tick 100 00:07:00.698 tick 100 00:07:00.698 test_end 00:07:00.698 00:07:00.698 real 0m1.240s 00:07:00.698 user 0m1.052s 00:07:00.698 sys 0m0.084s 00:07:00.698 21:32:02 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:00.698 21:32:02 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:00.698 ************************************ 00:07:00.698 END TEST event_reactor 00:07:00.698 ************************************ 00:07:00.698 21:32:02 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:00.698 21:32:02 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:07:00.698 21:32:02 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:00.698 21:32:02 event -- common/autotest_common.sh@10 -- # set +x 00:07:00.698 ************************************ 00:07:00.698 START TEST event_reactor_perf 00:07:00.698 ************************************ 00:07:00.698 21:32:02 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:00.698 [2024-10-27 21:32:02.163620] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:00.698 [2024-10-27 21:32:02.163703] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3311277 ] 00:07:00.698 [2024-10-27 21:32:02.302012] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:00.698 [2024-10-27 21:32:02.337599] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.698 [2024-10-27 21:32:02.362221] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.711 test_start 00:07:01.711 test_end 00:07:01.711 Performance: 976258 events per second 00:07:01.711 00:07:01.711 real 0m1.243s 00:07:01.711 user 0m1.061s 00:07:01.711 sys 0m0.078s 00:07:01.711 21:32:03 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:01.711 21:32:03 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:01.711 ************************************ 00:07:01.711 END TEST event_reactor_perf 00:07:01.711 ************************************ 00:07:02.015 21:32:03 event -- event/event.sh@49 -- # uname -s 00:07:02.015 21:32:03 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:02.015 21:32:03 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:02.015 21:32:03 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:02.016 21:32:03 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:02.016 21:32:03 event -- common/autotest_common.sh@10 -- # set +x 00:07:02.016 ************************************ 00:07:02.016 START TEST event_scheduler 00:07:02.016 ************************************ 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:02.016 * Looking for test storage... 00:07:02.016 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@1689 -- # lcov --version 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:02.016 21:32:03 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:07:02.016 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:02.016 --rc genhtml_branch_coverage=1 00:07:02.016 --rc genhtml_function_coverage=1 00:07:02.016 --rc genhtml_legend=1 00:07:02.016 --rc geninfo_all_blocks=1 00:07:02.016 --rc geninfo_unexecuted_blocks=1 00:07:02.016 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:02.016 ' 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:07:02.016 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:02.016 --rc genhtml_branch_coverage=1 00:07:02.016 --rc genhtml_function_coverage=1 00:07:02.016 --rc genhtml_legend=1 00:07:02.016 --rc geninfo_all_blocks=1 00:07:02.016 --rc geninfo_unexecuted_blocks=1 00:07:02.016 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:02.016 ' 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:07:02.016 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:02.016 --rc genhtml_branch_coverage=1 00:07:02.016 --rc genhtml_function_coverage=1 00:07:02.016 --rc genhtml_legend=1 00:07:02.016 --rc geninfo_all_blocks=1 00:07:02.016 --rc geninfo_unexecuted_blocks=1 00:07:02.016 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:02.016 ' 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:07:02.016 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:02.016 --rc genhtml_branch_coverage=1 00:07:02.016 --rc genhtml_function_coverage=1 00:07:02.016 --rc genhtml_legend=1 00:07:02.016 --rc geninfo_all_blocks=1 00:07:02.016 --rc geninfo_unexecuted_blocks=1 00:07:02.016 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:02.016 ' 00:07:02.016 21:32:03 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:02.016 21:32:03 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=3311601 00:07:02.016 21:32:03 event.event_scheduler -- 
scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:02.016 21:32:03 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:02.016 21:32:03 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 3311601 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 3311601 ']' 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:02.016 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:02.016 21:32:03 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:02.016 [2024-10-27 21:32:03.703419] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:02.016 [2024-10-27 21:32:03.703510] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3311601 ] 00:07:02.276 [2024-10-27 21:32:03.841134] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
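Because the scheduler app above was started with --wait-for-rpc, subsystem initialization is deferred until RPCs select and start the scheduler; the trace below does exactly that via the rpc_cmd helper. A sketch of the equivalent plain rpc.py calls (rpc_cmd is an autotest wrapper around this script):

    # Pick the dynamic scheduler first, then let initialization proceed.
    ./scripts/rpc.py framework_set_scheduler dynamic
    ./scripts/rpc.py framework_start_init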
00:07:02.276 [2024-10-27 21:32:03.871899] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:02.276 [2024-10-27 21:32:03.897845] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.276 [2024-10-27 21:32:03.897930] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:02.276 [2024-10-27 21:32:03.898018] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:02.276 [2024-10-27 21:32:03.898020] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:02.843 21:32:04 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:02.843 21:32:04 event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:07:02.843 21:32:04 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:02.843 21:32:04 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:02.843 21:32:04 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:02.843 [2024-10-27 21:32:04.562772] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:02.843 [2024-10-27 21:32:04.562792] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:07:02.843 [2024-10-27 21:32:04.562804] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:02.843 [2024-10-27 21:32:04.562812] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:02.843 [2024-10-27 21:32:04.562821] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:02.843 21:32:04 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:02.843 21:32:04 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:02.843 21:32:04 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:02.843 21:32:04 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:03.101 [2024-10-27 21:32:04.630252] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
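With the app still paused in --wait-for-rpc, the test selects the dynamic scheduler and only then releases framework init; the notices above show the dynamic scheduler proceeding without the DPDK governor (the core mask holds some but not all SMT siblings) and applying load limit 20, core limit 80, and core busy 95. The same two RPCs as a standalone sketch, with the socket path assumed:

    rpc=./scripts/rpc.py
    sock=/var/tmp/spdk.sock
    # Ordering matters in this test: the scheduler is chosen while the app
    # waits in --wait-for-rpc, and init proceeds only afterwards.
    "$rpc" -s "$sock" framework_set_scheduler dynamic
    "$rpc" -s "$sock" framework_start_init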
00:07:03.101 21:32:04 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.101 21:32:04 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:03.101 21:32:04 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:03.101 21:32:04 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:03.101 21:32:04 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:03.101 ************************************ 00:07:03.101 START TEST scheduler_create_thread 00:07:03.101 ************************************ 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.101 2 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.101 3 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.101 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.101 4 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.102 5 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.102 6 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.102 7 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.102 8 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.102 9 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.102 10 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:03.102 21:32:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:04.040 21:32:05 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:04.040 21:32:05 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:04.040 21:32:05 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:04.040 21:32:05 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:05.418 21:32:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:05.418 21:32:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:05.418 21:32:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:05.418 21:32:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:05.418 21:32:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:06.357 21:32:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:06.357 00:07:06.357 real 0m3.376s 00:07:06.357 user 0m0.027s 00:07:06.357 sys 0m0.005s 00:07:06.357 21:32:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:06.357 21:32:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:06.357 ************************************ 00:07:06.357 END TEST scheduler_create_thread 00:07:06.357 ************************************ 00:07:06.615 21:32:08 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:06.615 21:32:08 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 3311601 00:07:06.615 21:32:08 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 3311601 ']' 00:07:06.615 21:32:08 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 3311601 00:07:06.615 21:32:08 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:07:06.615 21:32:08 event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:06.615 21:32:08 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3311601 00:07:06.615 21:32:08 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:07:06.615 21:32:08 event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:07:06.615 21:32:08 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3311601' 00:07:06.615 killing process with pid 3311601 00:07:06.615 21:32:08 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 3311601 00:07:06.616 21:32:08 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 3311601 00:07:06.875 [2024-10-27 21:32:08.425165] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
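The scheduler_create_thread test traced above drives everything through a test-only RPC plugin: four busy and four idle threads pinned one per core, two unpinned threads at different activity levels, one thread retuned with scheduler_thread_set_active, and one created and deleted again. A condensed replay of that sequence using only calls that appear in the trace; the socket path is an assumption, and the id capture mirrors the trace's thread_id=11/12 assignments from the RPC's stdout:

    rpc() { ./scripts/rpc.py -s /var/tmp/spdk.sock --plugin scheduler_plugin "$@"; }
    for mask in 0x1 0x2 0x4 0x8; do                    # one 100%-busy thread per core
        rpc scheduler_thread_create -n active_pinned -m "$mask" -a 100
    done
    for mask in 0x1 0x2 0x4 0x8; do                    # one idle thread per core
        rpc scheduler_thread_create -n idle_pinned -m "$mask" -a 0
    done
    rpc scheduler_thread_create -n one_third_active -a 30     # unpinned, 30% busy
    id=$(rpc scheduler_thread_create -n half_active -a 0)     # RPC prints the new thread id
    rpc scheduler_thread_set_active "$id" 50                  # retune it to 50% busy
    id=$(rpc scheduler_thread_create -n deleted -a 100)
    rpc scheduler_thread_delete "$id"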
00:07:07.134 00:07:07.134 real 0m5.137s 00:07:07.134 user 0m10.327s 00:07:07.134 sys 0m0.467s 00:07:07.134 21:32:08 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:07.134 21:32:08 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:07.134 ************************************ 00:07:07.134 END TEST event_scheduler 00:07:07.134 ************************************ 00:07:07.134 21:32:08 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:07.134 21:32:08 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:07.134 21:32:08 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:07.134 21:32:08 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:07.134 21:32:08 event -- common/autotest_common.sh@10 -- # set +x 00:07:07.134 ************************************ 00:07:07.134 START TEST app_repeat 00:07:07.134 ************************************ 00:07:07.134 21:32:08 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@19 -- # repeat_pid=3312514 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 3312514' 00:07:07.134 Process app_repeat pid: 3312514 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:07.134 spdk_app_start Round 0 00:07:07.134 21:32:08 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3312514 /var/tmp/spdk-nbd.sock 00:07:07.134 21:32:08 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 3312514 ']' 00:07:07.134 21:32:08 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:07.134 21:32:08 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:07.134 21:32:08 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:07.134 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:07.134 21:32:08 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:07.134 21:32:08 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:07.134 [2024-10-27 21:32:08.741903] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 
00:07:07.134 [2024-10-27 21:32:08.742003] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3312514 ] 00:07:07.395 [2024-10-27 21:32:08.881405] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:07.395 [2024-10-27 21:32:08.915323] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:07.395 [2024-10-27 21:32:08.937734] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:07.395 [2024-10-27 21:32:08.937737] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.963 21:32:09 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:07.963 21:32:09 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:07.963 21:32:09 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:08.227 Malloc0 00:07:08.227 21:32:09 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:08.487 Malloc1 00:07:08.487 21:32:10 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:08.487 21:32:10 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:08.487 /dev/nbd0 00:07:08.746 21:32:10 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:08.747 21:32:10 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 
00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:08.747 1+0 records in 00:07:08.747 1+0 records out 00:07:08.747 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231106 s, 17.7 MB/s 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:08.747 21:32:10 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:08.747 21:32:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:08.747 21:32:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:08.747 21:32:10 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:08.747 /dev/nbd1 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:09.007 1+0 records in 00:07:09.007 1+0 records out 00:07:09.007 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000499974 s, 8.2 MB/s 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:09.007 21:32:10 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:09.007 21:32:10 event.app_repeat -- 
common/autotest_common.sh@889 -- # return 0 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:09.007 { 00:07:09.007 "nbd_device": "/dev/nbd0", 00:07:09.007 "bdev_name": "Malloc0" 00:07:09.007 }, 00:07:09.007 { 00:07:09.007 "nbd_device": "/dev/nbd1", 00:07:09.007 "bdev_name": "Malloc1" 00:07:09.007 } 00:07:09.007 ]' 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:09.007 { 00:07:09.007 "nbd_device": "/dev/nbd0", 00:07:09.007 "bdev_name": "Malloc0" 00:07:09.007 }, 00:07:09.007 { 00:07:09.007 "nbd_device": "/dev/nbd1", 00:07:09.007 "bdev_name": "Malloc1" 00:07:09.007 } 00:07:09.007 ]' 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:09.007 /dev/nbd1' 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:09.007 /dev/nbd1' 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:09.007 21:32:10 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:09.267 256+0 records in 00:07:09.267 256+0 records out 00:07:09.267 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110712 s, 94.7 MB/s 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:09.267 256+0 records in 00:07:09.267 256+0 records out 00:07:09.267 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200226 s, 52.4 MB/s 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:09.267 256+0 records in 00:07:09.267 256+0 records out 00:07:09.267 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212436 s, 49.4 MB/s 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:09.267 21:32:10 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:09.527 21:32:11 event.app_repeat -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:09.527 21:32:11 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:09.787 21:32:11 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:09.787 21:32:11 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:10.046 21:32:11 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:10.306 [2024-10-27 21:32:11.849012] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:10.306 [2024-10-27 21:32:11.868677] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:10.306 [2024-10-27 21:32:11.868679] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.306 [2024-10-27 21:32:11.908651] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:10.306 [2024-10-27 21:32:11.908696] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
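Each app_repeat round above exports Malloc0 and Malloc1 as /dev/nbd0 and /dev/nbd1, writes 1 MiB of random data through each device with O_DIRECT, and compares the device contents byte-for-byte against the source file. The write/verify cycle in isolation, with an assumed temp-file location; the dd and cmp arguments are exactly those in the trace:

    tmp=/tmp/nbdrandtest                                 # illustrative path
    dd if=/dev/urandom of="$tmp" bs=4096 count=256       # 1 MiB of random data
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct   # O_DIRECT write
        cmp -b -n 1M "$tmp" "$nbd"                       # fails loudly on any mismatch
    done
    rm "$tmp"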
00:07:13.598 21:32:14 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:13.598 21:32:14 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:13.598 spdk_app_start Round 1 00:07:13.598 21:32:14 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3312514 /var/tmp/spdk-nbd.sock 00:07:13.598 21:32:14 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 3312514 ']' 00:07:13.598 21:32:14 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:13.598 21:32:14 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:13.599 21:32:14 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:13.599 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:13.599 21:32:14 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:13.599 21:32:14 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:13.599 21:32:14 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:13.599 21:32:14 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:13.599 21:32:14 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:13.599 Malloc0 00:07:13.599 21:32:15 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:13.599 Malloc1 00:07:13.599 21:32:15 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:13.599 21:32:15 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:13.858 /dev/nbd0 00:07:13.858 21:32:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:13.858 21:32:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd0 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:13.858 1+0 records in 00:07:13.858 1+0 records out 00:07:13.858 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259366 s, 15.8 MB/s 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:13.858 21:32:15 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:13.859 21:32:15 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:13.859 21:32:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:13.859 21:32:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:13.859 21:32:15 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:14.118 /dev/nbd1 00:07:14.118 21:32:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:14.118 21:32:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:14.118 1+0 records in 00:07:14.118 1+0 records out 00:07:14.118 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284434 s, 14.4 MB/s 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:14.118 21:32:15 
event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:14.118 21:32:15 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:14.118 21:32:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:14.118 21:32:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:14.118 21:32:15 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:14.118 21:32:15 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.118 21:32:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:14.378 { 00:07:14.378 "nbd_device": "/dev/nbd0", 00:07:14.378 "bdev_name": "Malloc0" 00:07:14.378 }, 00:07:14.378 { 00:07:14.378 "nbd_device": "/dev/nbd1", 00:07:14.378 "bdev_name": "Malloc1" 00:07:14.378 } 00:07:14.378 ]' 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:14.378 { 00:07:14.378 "nbd_device": "/dev/nbd0", 00:07:14.378 "bdev_name": "Malloc0" 00:07:14.378 }, 00:07:14.378 { 00:07:14.378 "nbd_device": "/dev/nbd1", 00:07:14.378 "bdev_name": "Malloc1" 00:07:14.378 } 00:07:14.378 ]' 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:14.378 /dev/nbd1' 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:14.378 /dev/nbd1' 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:14.378 21:32:15 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:14.378 256+0 records in 00:07:14.378 256+0 records out 00:07:14.378 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0100692 s, 104 MB/s 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:14.378 256+0 records in 00:07:14.378 256+0 records out 00:07:14.378 1048576 bytes 
(1.0 MB, 1.0 MiB) copied, 0.0200368 s, 52.3 MB/s 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:14.378 256+0 records in 00:07:14.378 256+0 records out 00:07:14.378 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211011 s, 49.7 MB/s 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:14.378 21:32:16 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:14.638 21:32:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:14.638 21:32:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:14.638 21:32:16 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:14.638 21:32:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:14.638 21:32:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:14.638 21:32:16 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:14.638 21:32:16 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:14.638 21:32:16 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:14.638 21:32:16 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:14.638 21:32:16 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.897 21:32:16 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:15.156 21:32:16 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:15.156 21:32:16 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:15.416 21:32:16 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:15.416 [2024-10-27 21:32:17.069956] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:15.416 [2024-10-27 21:32:17.089715] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:15.416 [2024-10-27 21:32:17.089717] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.416 [2024-10-27 21:32:17.130698] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:15.416 [2024-10-27 21:32:17.130741] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
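The waitfornbd traces interleaved through each round poll /proc/partitions until the device node registers (bounded at 20 tries), then perform one O_DIRECT read to prove the device actually services I/O rather than merely existing. A reconstruction from the traced commands; the temp path and the sleep interval are assumptions (the trace shows the retry bound but no delay), and the trace's second retry loop around the read is collapsed to a single attempt here:

    waitfornbd() {
        local nbd_name=$1 i size tmp=/tmp/nbdtest
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1
        done
        # A successful 4 KiB direct read distinguishes a live device from a
        # bare /proc/partitions entry.
        dd if="/dev/$nbd_name" of="$tmp" bs=4096 count=1 iflag=direct
        size=$(stat -c %s "$tmp")
        rm -f "$tmp"
        [ "$size" != 0 ]
    }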
00:07:18.708 21:32:19 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:18.708 21:32:19 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:18.708 spdk_app_start Round 2 00:07:18.708 21:32:19 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3312514 /var/tmp/spdk-nbd.sock 00:07:18.708 21:32:19 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 3312514 ']' 00:07:18.708 21:32:19 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:18.708 21:32:19 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:18.708 21:32:19 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:18.708 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:18.708 21:32:19 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:18.708 21:32:19 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:18.708 21:32:20 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:18.708 21:32:20 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:18.708 21:32:20 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:18.708 Malloc0 00:07:18.708 21:32:20 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:18.968 Malloc1 00:07:18.968 21:32:20 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:18.968 21:32:20 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:19.227 /dev/nbd0 00:07:19.227 21:32:20 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:19.227 21:32:20 event.app_repeat -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd0 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:19.227 1+0 records in 00:07:19.227 1+0 records out 00:07:19.227 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239121 s, 17.1 MB/s 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:19.227 21:32:20 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:19.227 21:32:20 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:19.227 21:32:20 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:19.227 21:32:20 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:19.486 /dev/nbd1 00:07:19.486 21:32:20 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:19.486 21:32:20 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:19.486 21:32:20 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:19.486 21:32:20 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:19.486 21:32:20 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:19.486 21:32:20 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:19.486 21:32:20 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:19.486 21:32:21 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:19.486 21:32:21 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:19.486 21:32:21 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:19.486 21:32:21 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:19.486 1+0 records in 00:07:19.486 1+0 records out 00:07:19.486 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272731 s, 15.0 MB/s 00:07:19.486 21:32:21 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:19.486 21:32:21 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:19.486 21:32:21 
event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:19.486 21:32:21 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:19.486 21:32:21 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:19.486 21:32:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:19.486 21:32:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:19.486 21:32:21 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:19.486 21:32:21 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:19.486 21:32:21 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:19.486 21:32:21 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:19.486 { 00:07:19.486 "nbd_device": "/dev/nbd0", 00:07:19.486 "bdev_name": "Malloc0" 00:07:19.487 }, 00:07:19.487 { 00:07:19.487 "nbd_device": "/dev/nbd1", 00:07:19.487 "bdev_name": "Malloc1" 00:07:19.487 } 00:07:19.487 ]' 00:07:19.487 21:32:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:19.745 { 00:07:19.745 "nbd_device": "/dev/nbd0", 00:07:19.745 "bdev_name": "Malloc0" 00:07:19.745 }, 00:07:19.745 { 00:07:19.745 "nbd_device": "/dev/nbd1", 00:07:19.745 "bdev_name": "Malloc1" 00:07:19.745 } 00:07:19.745 ]' 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:19.745 /dev/nbd1' 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:19.745 /dev/nbd1' 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:19.745 256+0 records in 00:07:19.745 256+0 records out 00:07:19.745 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116055 s, 90.4 MB/s 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:19.745 256+0 records in 00:07:19.745 256+0 records out 00:07:19.745 1048576 
bytes (1.0 MB, 1.0 MiB) copied, 0.019995 s, 52.4 MB/s 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:19.745 256+0 records in 00:07:19.745 256+0 records out 00:07:19.745 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0214583 s, 48.9 MB/s 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:19.745 21:32:21 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:20.004 21:32:21 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:20.004 21:32:21 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:20.004 21:32:21 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:20.004 21:32:21 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:20.004 21:32:21 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:20.004 21:32:21 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:20.004 21:32:21 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:20.004 21:32:21 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:20.004 21:32:21 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:20.004 21:32:21 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:20.263 21:32:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:20.523 21:32:21 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:20.523 21:32:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:20.523 21:32:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:20.523 21:32:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:20.523 21:32:21 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:20.523 21:32:21 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:20.523 21:32:22 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:20.523 21:32:22 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:20.523 21:32:22 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:20.523 21:32:22 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:20.523 21:32:22 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:20.782 [2024-10-27 21:32:22.353436] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:20.782 [2024-10-27 21:32:22.373356] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:20.782 [2024-10-27 21:32:22.373357] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.782 [2024-10-27 21:32:22.413522] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:20.782 [2024-10-27 21:32:22.413566] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
00:07:24.073 21:32:25 event.app_repeat -- event/event.sh@38 -- # waitforlisten 3312514 /var/tmp/spdk-nbd.sock 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 3312514 ']' 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:24.073 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:24.073 21:32:25 event.app_repeat -- event/event.sh@39 -- # killprocess 3312514 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 3312514 ']' 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 3312514 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3312514 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3312514' 00:07:24.073 killing process with pid 3312514 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@969 -- # kill 3312514 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@974 -- # wait 3312514 00:07:24.073 spdk_app_start is called in Round 0. 00:07:24.073 Shutdown signal received, stop current app iteration 00:07:24.073 Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 reinitialization... 00:07:24.073 spdk_app_start is called in Round 1. 00:07:24.073 Shutdown signal received, stop current app iteration 00:07:24.073 Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 reinitialization... 00:07:24.073 spdk_app_start is called in Round 2. 00:07:24.073 Shutdown signal received, stop current app iteration 00:07:24.073 Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 reinitialization... 00:07:24.073 spdk_app_start is called in Round 3. 
00:07:24.073 Shutdown signal received, stop current app iteration 00:07:24.073 21:32:25 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:24.073 21:32:25 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:24.073 00:07:24.073 real 0m16.881s 00:07:24.073 user 0m36.263s 00:07:24.073 sys 0m3.182s 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:24.073 21:32:25 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:24.073 ************************************ 00:07:24.073 END TEST app_repeat 00:07:24.073 ************************************ 00:07:24.073 21:32:25 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:24.073 21:32:25 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:24.073 21:32:25 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:24.073 21:32:25 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:24.073 21:32:25 event -- common/autotest_common.sh@10 -- # set +x 00:07:24.073 ************************************ 00:07:24.073 START TEST cpu_locks 00:07:24.073 ************************************ 00:07:24.073 21:32:25 event.cpu_locks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:24.073 * Looking for test storage... 00:07:24.073 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:24.073 21:32:25 event.cpu_locks -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:07:24.073 21:32:25 event.cpu_locks -- common/autotest_common.sh@1689 -- # lcov --version 00:07:24.073 21:32:25 event.cpu_locks -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:07:24.333 21:32:25 event.cpu_locks -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:24.333 21:32:25 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:07:24.333 21:32:25 event.cpu_locks -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:24.333 21:32:25 event.cpu_locks -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:07:24.333 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:24.333 --rc genhtml_branch_coverage=1 00:07:24.333 --rc genhtml_function_coverage=1 00:07:24.333 --rc genhtml_legend=1 00:07:24.333 --rc geninfo_all_blocks=1 00:07:24.333 --rc geninfo_unexecuted_blocks=1 00:07:24.333 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:24.333 ' 00:07:24.333 21:32:25 event.cpu_locks -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:07:24.333 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:24.333 --rc genhtml_branch_coverage=1 00:07:24.333 --rc genhtml_function_coverage=1 00:07:24.333 --rc genhtml_legend=1 00:07:24.333 --rc geninfo_all_blocks=1 00:07:24.333 --rc geninfo_unexecuted_blocks=1 00:07:24.333 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:24.333 ' 00:07:24.333 21:32:25 event.cpu_locks -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:07:24.333 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:24.333 --rc genhtml_branch_coverage=1 00:07:24.333 --rc genhtml_function_coverage=1 00:07:24.333 --rc genhtml_legend=1 00:07:24.333 --rc geninfo_all_blocks=1 00:07:24.333 --rc geninfo_unexecuted_blocks=1 00:07:24.333 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:24.333 ' 00:07:24.333 21:32:25 event.cpu_locks -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:07:24.333 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:24.333 --rc genhtml_branch_coverage=1 00:07:24.333 --rc genhtml_function_coverage=1 00:07:24.333 --rc genhtml_legend=1 00:07:24.333 --rc geninfo_all_blocks=1 00:07:24.333 --rc geninfo_unexecuted_blocks=1 00:07:24.333 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:24.333 ' 00:07:24.333 21:32:25 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:24.333 21:32:25 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:24.333 21:32:25 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:24.333 21:32:25 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:24.333 21:32:25 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:24.333 21:32:25 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:24.333 21:32:25 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:24.333 ************************************ 00:07:24.333 START TEST default_locks 00:07:24.333 ************************************ 00:07:24.333 21:32:25 event.cpu_locks.default_locks -- common/autotest_common.sh@1125 -- # default_locks 00:07:24.333 21:32:25 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=3315686 00:07:24.333 21:32:25 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 3315686 00:07:24.333 21:32:25 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:24.333 21:32:25 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 3315686 ']' 00:07:24.333 21:32:25 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:24.333 21:32:25 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:24.334 21:32:25 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:24.334 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:24.334 21:32:25 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:24.334 21:32:25 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:24.334 [2024-10-27 21:32:25.926646] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:24.334 [2024-10-27 21:32:25.926721] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3315686 ] 00:07:24.593 [2024-10-27 21:32:26.062491] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:24.593 [2024-10-27 21:32:26.096837] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.593 [2024-10-27 21:32:26.119112] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.161 21:32:26 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:25.161 21:32:26 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 0 00:07:25.161 21:32:26 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 3315686 00:07:25.161 21:32:26 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 3315686 00:07:25.161 21:32:26 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:25.730 lslocks: write error 00:07:25.730 21:32:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 3315686 00:07:25.730 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@950 -- # '[' -z 3315686 ']' 00:07:25.730 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # kill -0 3315686 00:07:25.730 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # uname 00:07:25.730 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:25.730 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3315686 00:07:25.990 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:25.990 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:25.990 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3315686' 00:07:25.990 killing process with pid 3315686 00:07:25.990 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@969 -- # kill 3315686 00:07:25.990 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@974 -- # wait 3315686 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 3315686 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@650 -- # local es=0 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3315686 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # waitforlisten 3315686 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 3315686 ']' 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:26.250 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:26.250 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (3315686) - No such process 00:07:26.250 ERROR: process (pid: 3315686) is no longer running 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 1 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # es=1 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:26.250 00:07:26.250 real 0m1.862s 00:07:26.250 user 0m1.875s 00:07:26.250 sys 0m0.698s 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:26.250 21:32:27 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:26.250 ************************************ 00:07:26.250 END TEST default_locks 00:07:26.250 ************************************ 00:07:26.250 21:32:27 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:26.250 21:32:27 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:26.250 21:32:27 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:26.250 21:32:27 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:26.250 ************************************ 00:07:26.250 START TEST default_locks_via_rpc 00:07:26.250 ************************************ 00:07:26.250 21:32:27 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1125 -- # default_locks_via_rpc 00:07:26.250 21:32:27 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=3315990 00:07:26.250 21:32:27 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 3315990 00:07:26.250 21:32:27 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 3315990 ']' 00:07:26.250 21:32:27 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:26.250 21:32:27 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:26.250 21:32:27 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:26.250 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:26.250 21:32:27 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:26.250 21:32:27 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:26.250 21:32:27 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:26.250 [2024-10-27 21:32:27.848348] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:26.250 [2024-10-27 21:32:27.848424] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3315990 ] 00:07:26.510 [2024-10-27 21:32:27.984918] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:26.510 [2024-10-27 21:32:28.020071] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.510 [2024-10-27 21:32:28.042216] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 3315990 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 3315990 00:07:27.078 21:32:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:27.338 21:32:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 3315990 00:07:27.338 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@950 -- # '[' -z 3315990 ']' 00:07:27.338 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # kill -0 3315990 00:07:27.338 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # uname 00:07:27.338 21:32:28 
event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:27.338 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3315990 00:07:27.338 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:27.338 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:27.338 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3315990' 00:07:27.338 killing process with pid 3315990 00:07:27.338 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@969 -- # kill 3315990 00:07:27.338 21:32:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@974 -- # wait 3315990 00:07:27.598 00:07:27.598 real 0m1.442s 00:07:27.598 user 0m1.418s 00:07:27.598 sys 0m0.490s 00:07:27.598 21:32:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:27.598 21:32:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:27.598 ************************************ 00:07:27.598 END TEST default_locks_via_rpc 00:07:27.598 ************************************ 00:07:27.598 21:32:29 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:27.598 21:32:29 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:27.598 21:32:29 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:27.598 21:32:29 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:27.857 ************************************ 00:07:27.857 START TEST non_locking_app_on_locked_coremask 00:07:27.857 ************************************ 00:07:27.857 21:32:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # non_locking_app_on_locked_coremask 00:07:27.857 21:32:29 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=3316288 00:07:27.857 21:32:29 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 3316288 /var/tmp/spdk.sock 00:07:27.857 21:32:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3316288 ']' 00:07:27.857 21:32:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:27.857 21:32:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:27.857 21:32:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:27.857 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:27.857 21:32:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:27.857 21:32:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:27.857 21:32:29 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:27.857 [2024-10-27 21:32:29.362604] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:27.857 [2024-10-27 21:32:29.362660] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3316288 ] 00:07:27.857 [2024-10-27 21:32:29.497813] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:27.857 [2024-10-27 21:32:29.533312] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.857 [2024-10-27 21:32:29.555499] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=3316548 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 3316548 /var/tmp/spdk2.sock 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3316548 ']' 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:28.794 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:28.794 21:32:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:28.794 [2024-10-27 21:32:30.228168] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:28.794 [2024-10-27 21:32:30.228219] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3316548 ] 00:07:28.794 [2024-10-27 21:32:30.365420] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:28.794 [2024-10-27 21:32:30.423895] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:28.794 [2024-10-27 21:32:30.423919] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.794 [2024-10-27 21:32:30.471404] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.732 21:32:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:29.732 21:32:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:29.732 21:32:31 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 3316288 00:07:29.732 21:32:31 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 3316288 00:07:29.732 21:32:31 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:30.671 lslocks: write error 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 3316288 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 3316288 ']' 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 3316288 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3316288 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3316288' 00:07:30.671 killing process with pid 3316288 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 3316288 00:07:30.671 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 3316288 00:07:31.241 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 3316548 00:07:31.241 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 3316548 ']' 00:07:31.241 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 3316548 00:07:31.241 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:31.241 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:31.241 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3316548 00:07:31.501 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:31.501 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:31.501 21:32:32 
event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3316548' 00:07:31.501 killing process with pid 3316548 00:07:31.501 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 3316548 00:07:31.501 21:32:32 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 3316548 00:07:31.760 00:07:31.760 real 0m3.924s 00:07:31.760 user 0m4.132s 00:07:31.760 sys 0m1.318s 00:07:31.760 21:32:33 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:31.760 21:32:33 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:31.760 ************************************ 00:07:31.760 END TEST non_locking_app_on_locked_coremask 00:07:31.760 ************************************ 00:07:31.760 21:32:33 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:31.760 21:32:33 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:31.760 21:32:33 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:31.760 21:32:33 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:31.760 ************************************ 00:07:31.760 START TEST locking_app_on_unlocked_coremask 00:07:31.760 ************************************ 00:07:31.760 21:32:33 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_unlocked_coremask 00:07:31.760 21:32:33 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=3317114 00:07:31.760 21:32:33 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 3317114 /var/tmp/spdk.sock 00:07:31.760 21:32:33 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:31.760 21:32:33 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3317114 ']' 00:07:31.760 21:32:33 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.760 21:32:33 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:31.760 21:32:33 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.760 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.760 21:32:33 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:31.760 21:32:33 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:31.760 [2024-10-27 21:32:33.376002] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 
00:07:31.760 [2024-10-27 21:32:33.376086] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3317114 ] 00:07:32.020 [2024-10-27 21:32:33.512364] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:32.020 [2024-10-27 21:32:33.548051] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:32.020 [2024-10-27 21:32:33.548074] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.020 [2024-10-27 21:32:33.568188] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=3317135 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 3317135 /var/tmp/spdk2.sock 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3317135 ']' 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:32.589 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:32.589 21:32:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:32.589 [2024-10-27 21:32:34.245389] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:32.589 [2024-10-27 21:32:34.245477] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3317135 ] 00:07:32.848 [2024-10-27 21:32:34.383414] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:32.848 [2024-10-27 21:32:34.441490] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.848 [2024-10-27 21:32:34.487450] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.417 21:32:35 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:33.417 21:32:35 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:33.417 21:32:35 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 3317135 00:07:33.417 21:32:35 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 3317135 00:07:33.417 21:32:35 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:34.798 lslocks: write error 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 3317114 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 3317114 ']' 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 3317114 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3317114 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3317114' 00:07:34.798 killing process with pid 3317114 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 3317114 00:07:34.798 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 3317114 00:07:35.367 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 3317135 00:07:35.367 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 3317135 ']' 00:07:35.367 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 3317135 00:07:35.367 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:35.367 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:35.367 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3317135 00:07:35.367 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:35.367 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:35.367 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3317135' 00:07:35.367 killing process with pid 3317135 00:07:35.367 
21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 3317135 00:07:35.367 21:32:36 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 3317135 00:07:35.627 00:07:35.627 real 0m3.859s 00:07:35.627 user 0m4.080s 00:07:35.627 sys 0m1.274s 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:35.627 ************************************ 00:07:35.627 END TEST locking_app_on_unlocked_coremask 00:07:35.627 ************************************ 00:07:35.627 21:32:37 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:35.627 21:32:37 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:35.627 21:32:37 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:35.627 21:32:37 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:35.627 ************************************ 00:07:35.627 START TEST locking_app_on_locked_coremask 00:07:35.627 ************************************ 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_locked_coremask 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=3317702 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 3317702 /var/tmp/spdk.sock 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3317702 ']' 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:35.627 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:35.627 21:32:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:35.627 [2024-10-27 21:32:37.292315] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:35.627 [2024-10-27 21:32:37.292371] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3317702 ] 00:07:35.887 [2024-10-27 21:32:37.427526] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:35.887 [2024-10-27 21:32:37.462494] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.887 [2024-10-27 21:32:37.484368] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=3317963 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 3317963 /var/tmp/spdk2.sock 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@650 -- # local es=0 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3317963 /var/tmp/spdk2.sock 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # waitforlisten 3317963 /var/tmp/spdk2.sock 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3317963 ']' 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:36.457 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:36.457 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:36.457 [2024-10-27 21:32:38.132653] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:36.457 [2024-10-27 21:32:38.132700] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3317963 ] 00:07:36.716 [2024-10-27 21:32:38.267418] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:36.716 [2024-10-27 21:32:38.326195] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 3317702 has claimed it. 00:07:36.716 [2024-10-27 21:32:38.326227] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:37.286 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (3317963) - No such process 00:07:37.286 ERROR: process (pid: 3317963) is no longer running 00:07:37.286 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:37.286 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 1 00:07:37.286 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # es=1 00:07:37.286 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:37.286 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:37.286 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:37.286 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 3317702 00:07:37.286 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 3317702 00:07:37.286 21:32:38 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:37.545 lslocks: write error 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 3317702 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 3317702 ']' 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 3317702 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3317702 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3317702' 00:07:37.545 killing process with pid 3317702 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 3317702 00:07:37.545 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 3317702 00:07:37.804 00:07:37.804 real 0m2.111s 00:07:37.804 user 0m2.223s 00:07:37.804 sys 0m0.569s 00:07:37.804 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:37.804 21:32:39 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:37.804 ************************************ 00:07:37.804 END TEST locking_app_on_locked_coremask 00:07:37.804 ************************************ 00:07:37.804 21:32:39 
event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:37.804 21:32:39 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:37.804 21:32:39 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:37.804 21:32:39 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:37.804 ************************************ 00:07:37.804 START TEST locking_overlapped_coremask 00:07:37.804 ************************************ 00:07:37.804 21:32:39 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask 00:07:37.804 21:32:39 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=3318259 00:07:37.804 21:32:39 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 3318259 /var/tmp/spdk.sock 00:07:37.804 21:32:39 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 3318259 ']' 00:07:37.804 21:32:39 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:37.804 21:32:39 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:37.804 21:32:39 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:37.804 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:37.804 21:32:39 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:37.804 21:32:39 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:37.804 21:32:39 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:37.804 [2024-10-27 21:32:39.476045] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:37.804 [2024-10-27 21:32:39.476102] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3318259 ] 00:07:38.063 [2024-10-27 21:32:39.611640] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:38.063 [2024-10-27 21:32:39.647208] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:38.063 [2024-10-27 21:32:39.671902] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:38.063 [2024-10-27 21:32:39.672005] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:38.063 [2024-10-27 21:32:39.672008] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=3318276 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 3318276 /var/tmp/spdk2.sock 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@650 -- # local es=0 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3318276 /var/tmp/spdk2.sock 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # waitforlisten 3318276 /var/tmp/spdk2.sock 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 3318276 ']' 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:38.633 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:38.633 21:32:40 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:38.633 [2024-10-27 21:32:40.348736] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:38.633 [2024-10-27 21:32:40.348805] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3318276 ] 00:07:38.891 [2024-10-27 21:32:40.487519] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. 
Enabled only for validation. 00:07:38.891 [2024-10-27 21:32:40.551260] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3318259 has claimed it. 00:07:38.891 [2024-10-27 21:32:40.551296] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:39.457 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (3318276) - No such process 00:07:39.457 ERROR: process (pid: 3318276) is no longer running 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 1 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # es=1 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 3318259 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@950 -- # '[' -z 3318259 ']' 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # kill -0 3318259 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # uname 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3318259 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3318259' 00:07:39.457 killing process with pid 3318259 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@969 -- # kill 3318259 00:07:39.457 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@974 -- # wait 3318259 00:07:39.715 00:07:39.715 real 0m1.923s 00:07:39.715 user 0m5.309s 00:07:39.715 sys 0m0.488s 00:07:39.715 21:32:41 event.cpu_locks.locking_overlapped_coremask -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:07:39.715 21:32:41 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:39.715 ************************************ 00:07:39.715 END TEST locking_overlapped_coremask 00:07:39.715 ************************************ 00:07:39.715 21:32:41 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:39.715 21:32:41 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:39.715 21:32:41 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:39.715 21:32:41 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:39.974 ************************************ 00:07:39.974 START TEST locking_overlapped_coremask_via_rpc 00:07:39.974 ************************************ 00:07:39.974 21:32:41 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask_via_rpc 00:07:39.974 21:32:41 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=3318568 00:07:39.974 21:32:41 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 3318568 /var/tmp/spdk.sock 00:07:39.975 21:32:41 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:39.975 21:32:41 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 3318568 ']' 00:07:39.975 21:32:41 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:39.975 21:32:41 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:39.975 21:32:41 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:39.975 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:39.975 21:32:41 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:39.975 21:32:41 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:39.975 [2024-10-27 21:32:41.483805] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:39.975 [2024-10-27 21:32:41.483866] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3318568 ] 00:07:39.975 [2024-10-27 21:32:41.621413] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:39.975 [2024-10-27 21:32:41.657815] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:39.975 [2024-10-27 21:32:41.657838] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:39.975 [2024-10-27 21:32:41.682962] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:39.975 [2024-10-27 21:32:41.683017] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:39.975 [2024-10-27 21:32:41.683019] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.909 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:40.909 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:40.909 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=3318731 00:07:40.909 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 3318731 /var/tmp/spdk2.sock 00:07:40.909 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:40.910 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 3318731 ']' 00:07:40.910 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:40.910 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:40.910 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:40.910 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:40.910 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:40.910 21:32:42 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:40.910 [2024-10-27 21:32:42.366623] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:40.910 [2024-10-27 21:32:42.366703] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3318731 ] 00:07:40.910 [2024-10-27 21:32:42.507746] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:40.910 [2024-10-27 21:32:42.570387] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:40.910 [2024-10-27 21:32:42.570411] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:40.910 [2024-10-27 21:32:42.619102] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:40.910 [2024-10-27 21:32:42.619222] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:40.910 [2024-10-27 21:32:42.619223] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 4 00:07:41.847 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:41.847 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@650 -- # local es=0 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:41.848 [2024-10-27 21:32:43.254001] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3318568 has claimed it. 
00:07:41.848 request: 00:07:41.848 { 00:07:41.848 "method": "framework_enable_cpumask_locks", 00:07:41.848 "req_id": 1 00:07:41.848 } 00:07:41.848 Got JSON-RPC error response 00:07:41.848 response: 00:07:41.848 { 00:07:41.848 "code": -32603, 00:07:41.848 "message": "Failed to claim CPU core: 2" 00:07:41.848 } 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # es=1 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 3318568 /var/tmp/spdk.sock 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 3318568 ']' 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:41.848 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 3318731 /var/tmp/spdk2.sock 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 3318731 ']' 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:41.848 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
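The -32603 error above is the expected result of the core overlap: the second target's mask 0x1c covers cores 2-4 and shares core 2 with the first target's 0x7 (cores 0-2), whose framework_enable_cpumask_locks call has already created /var/tmp/spdk_cpu_lock_002. A minimal reproduction outside the harness might look like the sketch below; the binaries, masks, and socket paths mirror this run, but the backgrounding and sleep are illustrative assumptions, not part of the test script:

    # sketch: reproduce the cpumask-lock claim conflict seen above (assumed sequencing)
    ./build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks &                          # cores 0-2, locks off at startup
    ./build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks &  # cores 2-4
    sleep 1                                                                        # assumed settle time
    ./scripts/rpc.py framework_enable_cpumask_locks                                # first target claims locks 000-002
    ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks
    # expected: JSON-RPC error -32603 "Failed to claim CPU core: 2",
    # since /var/tmp/spdk_cpu_lock_002 is already held by the first target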
00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:41.848 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:42.108 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:42.108 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:42.108 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:42.108 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:42.108 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:42.108 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:42.108 00:07:42.108 real 0m2.197s 00:07:42.108 user 0m0.905s 00:07:42.108 sys 0m0.224s 00:07:42.108 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:42.108 21:32:43 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:42.108 ************************************ 00:07:42.108 END TEST locking_overlapped_coremask_via_rpc 00:07:42.108 ************************************ 00:07:42.108 21:32:43 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:42.108 21:32:43 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 3318568 ]] 00:07:42.108 21:32:43 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 3318568 00:07:42.108 21:32:43 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 3318568 ']' 00:07:42.108 21:32:43 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 3318568 00:07:42.108 21:32:43 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:07:42.108 21:32:43 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:42.108 21:32:43 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3318568 00:07:42.108 21:32:43 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:42.108 21:32:43 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:42.108 21:32:43 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3318568' 00:07:42.108 killing process with pid 3318568 00:07:42.108 21:32:43 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 3318568 00:07:42.108 21:32:43 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 3318568 00:07:42.367 21:32:44 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 3318731 ]] 00:07:42.367 21:32:44 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 3318731 00:07:42.367 21:32:44 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 3318731 ']' 00:07:42.367 21:32:44 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 3318731 00:07:42.367 21:32:44 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:07:42.367 21:32:44 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' 
Linux = Linux ']' 00:07:42.367 21:32:44 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3318731 00:07:42.626 21:32:44 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:07:42.626 21:32:44 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:07:42.626 21:32:44 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3318731' 00:07:42.626 killing process with pid 3318731 00:07:42.626 21:32:44 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 3318731 00:07:42.626 21:32:44 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 3318731 00:07:42.886 21:32:44 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:42.886 21:32:44 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:42.886 21:32:44 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 3318568 ]] 00:07:42.886 21:32:44 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 3318568 00:07:42.886 21:32:44 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 3318568 ']' 00:07:42.886 21:32:44 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 3318568 00:07:42.886 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (3318568) - No such process 00:07:42.886 21:32:44 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 3318568 is not found' 00:07:42.886 Process with pid 3318568 is not found 00:07:42.886 21:32:44 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 3318731 ]] 00:07:42.886 21:32:44 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 3318731 00:07:42.886 21:32:44 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 3318731 ']' 00:07:42.886 21:32:44 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 3318731 00:07:42.886 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (3318731) - No such process 00:07:42.886 21:32:44 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 3318731 is not found' 00:07:42.886 Process with pid 3318731 is not found 00:07:42.886 21:32:44 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:42.886 00:07:42.886 real 0m18.751s 00:07:42.886 user 0m30.838s 00:07:42.886 sys 0m6.134s 00:07:42.886 21:32:44 event.cpu_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:42.886 21:32:44 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:42.886 ************************************ 00:07:42.886 END TEST cpu_locks 00:07:42.886 ************************************ 00:07:42.886 00:07:42.886 real 0m45.210s 00:07:42.886 user 1m23.869s 00:07:42.886 sys 0m10.512s 00:07:42.886 21:32:44 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:42.886 21:32:44 event -- common/autotest_common.sh@10 -- # set +x 00:07:42.886 ************************************ 00:07:42.886 END TEST event 00:07:42.886 ************************************ 00:07:42.886 21:32:44 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:42.886 21:32:44 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:42.886 21:32:44 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:42.886 21:32:44 -- common/autotest_common.sh@10 -- # set +x 00:07:42.886 ************************************ 00:07:42.886 START TEST thread 00:07:42.886 ************************************ 00:07:42.886 21:32:44 thread -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:43.146 * Looking for test storage... 00:07:43.146 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1689 -- # lcov --version 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:07:43.146 21:32:44 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:43.146 21:32:44 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:43.146 21:32:44 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:43.146 21:32:44 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:43.146 21:32:44 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:43.146 21:32:44 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:43.146 21:32:44 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:43.146 21:32:44 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:43.146 21:32:44 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:43.146 21:32:44 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:43.146 21:32:44 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:43.146 21:32:44 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:43.146 21:32:44 thread -- scripts/common.sh@345 -- # : 1 00:07:43.146 21:32:44 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:43.146 21:32:44 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:43.146 21:32:44 thread -- scripts/common.sh@365 -- # decimal 1 00:07:43.146 21:32:44 thread -- scripts/common.sh@353 -- # local d=1 00:07:43.146 21:32:44 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:43.146 21:32:44 thread -- scripts/common.sh@355 -- # echo 1 00:07:43.146 21:32:44 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:43.146 21:32:44 thread -- scripts/common.sh@366 -- # decimal 2 00:07:43.146 21:32:44 thread -- scripts/common.sh@353 -- # local d=2 00:07:43.146 21:32:44 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:43.146 21:32:44 thread -- scripts/common.sh@355 -- # echo 2 00:07:43.146 21:32:44 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:43.146 21:32:44 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:43.146 21:32:44 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:43.146 21:32:44 thread -- scripts/common.sh@368 -- # return 0 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:07:43.146 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.146 --rc genhtml_branch_coverage=1 00:07:43.146 --rc genhtml_function_coverage=1 00:07:43.146 --rc genhtml_legend=1 00:07:43.146 --rc geninfo_all_blocks=1 00:07:43.146 --rc geninfo_unexecuted_blocks=1 00:07:43.146 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.146 ' 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:07:43.146 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.146 --rc genhtml_branch_coverage=1 00:07:43.146 --rc genhtml_function_coverage=1 00:07:43.146 --rc genhtml_legend=1 
00:07:43.146 --rc geninfo_all_blocks=1 00:07:43.146 --rc geninfo_unexecuted_blocks=1 00:07:43.146 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.146 ' 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:07:43.146 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.146 --rc genhtml_branch_coverage=1 00:07:43.146 --rc genhtml_function_coverage=1 00:07:43.146 --rc genhtml_legend=1 00:07:43.146 --rc geninfo_all_blocks=1 00:07:43.146 --rc geninfo_unexecuted_blocks=1 00:07:43.146 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.146 ' 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:07:43.146 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.146 --rc genhtml_branch_coverage=1 00:07:43.146 --rc genhtml_function_coverage=1 00:07:43.146 --rc genhtml_legend=1 00:07:43.146 --rc geninfo_all_blocks=1 00:07:43.146 --rc geninfo_unexecuted_blocks=1 00:07:43.146 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.146 ' 00:07:43.146 21:32:44 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:43.146 21:32:44 thread -- common/autotest_common.sh@10 -- # set +x 00:07:43.146 ************************************ 00:07:43.146 START TEST thread_poller_perf 00:07:43.146 ************************************ 00:07:43.146 21:32:44 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:43.146 [2024-10-27 21:32:44.775157] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:43.146 [2024-10-27 21:32:44.775237] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3319214 ] 00:07:43.433 [2024-10-27 21:32:44.913681] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:43.433 [2024-10-27 21:32:44.946438] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.433 [2024-10-27 21:32:44.969413] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.433 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:44.451 [2024-10-27T20:32:46.179Z] ====================================== 00:07:44.451 [2024-10-27T20:32:46.179Z] busy:2499011450 (cyc) 00:07:44.451 [2024-10-27T20:32:46.179Z] total_run_count: 861000 00:07:44.451 [2024-10-27T20:32:46.179Z] tsc_hz: 2494100000 (cyc) 00:07:44.451 [2024-10-27T20:32:46.179Z] ====================================== 00:07:44.451 [2024-10-27T20:32:46.179Z] poller_cost: 2902 (cyc), 1163 (nsec) 00:07:44.451 00:07:44.451 real 0m1.241s 00:07:44.451 user 0m1.054s 00:07:44.451 sys 0m0.082s 00:07:44.451 21:32:45 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:44.451 21:32:45 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:44.451 ************************************ 00:07:44.451 END TEST thread_poller_perf 00:07:44.451 ************************************ 00:07:44.451 21:32:46 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:44.451 21:32:46 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:44.451 21:32:46 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:44.451 21:32:46 thread -- common/autotest_common.sh@10 -- # set +x 00:07:44.451 ************************************ 00:07:44.451 START TEST thread_poller_perf 00:07:44.451 ************************************ 00:07:44.451 21:32:46 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:44.451 [2024-10-27 21:32:46.099983] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:44.451 [2024-10-27 21:32:46.100067] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3319506 ] 00:07:44.710 [2024-10-27 21:32:46.240394] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:44.710 [2024-10-27 21:32:46.274715] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.710 [2024-10-27 21:32:46.297745] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.710 Running 1000 pollers for 1 seconds with 0 microseconds period. 
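The poller_cost printed for the 1-microsecond run above follows directly from its own counters; a quick check, assuming poller_cost = busy_cycles / total_run_count and nsec = cyc * 1e9 / tsc_hz:

    # sanity check of the 1-usec-period run's poller_cost (values copied from the table above)
    echo $(( 2499011450 / 861000 ))             # 2902 cyc per poll
    echo $(( 2902 * 1000000000 / 2494100000 ))  # 1163 nsec at tsc_hz 2494100000
    # the 0-usec run below works out the same way: 2495637322 / 13455000 = 185 cyc, ~74 nsec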
00:07:45.648 [2024-10-27T20:32:47.376Z] ====================================== 00:07:45.648 [2024-10-27T20:32:47.376Z] busy:2495637322 (cyc) 00:07:45.648 [2024-10-27T20:32:47.376Z] total_run_count: 13455000 00:07:45.648 [2024-10-27T20:32:47.376Z] tsc_hz: 2494100000 (cyc) 00:07:45.648 [2024-10-27T20:32:47.376Z] ====================================== 00:07:45.648 [2024-10-27T20:32:47.376Z] poller_cost: 185 (cyc), 74 (nsec) 00:07:45.648 00:07:45.648 real 0m1.241s 00:07:45.648 user 0m1.054s 00:07:45.648 sys 0m0.083s 00:07:45.648 21:32:47 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:45.648 21:32:47 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:45.648 ************************************ 00:07:45.648 END TEST thread_poller_perf 00:07:45.648 ************************************ 00:07:45.648 21:32:47 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:45.648 21:32:47 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:45.648 21:32:47 thread -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:45.648 21:32:47 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:45.648 21:32:47 thread -- common/autotest_common.sh@10 -- # set +x 00:07:45.908 ************************************ 00:07:45.908 START TEST thread_spdk_lock 00:07:45.908 ************************************ 00:07:45.908 21:32:47 thread.thread_spdk_lock -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:45.908 [2024-10-27 21:32:47.417699] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:45.908 [2024-10-27 21:32:47.417791] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3319793 ] 00:07:45.908 [2024-10-27 21:32:47.558068] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:45.908 [2024-10-27 21:32:47.591707] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:45.908 [2024-10-27 21:32:47.618025] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:45.908 [2024-10-27 21:32:47.618028] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.478 [2024-10-27 21:32:48.108775] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 980:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:46.479 [2024-10-27 21:32:48.108809] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3099:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:46.479 [2024-10-27 21:32:48.108819] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3054:sspin_stacks_print: *ERROR*: spinlock 0x134cb40 00:07:46.479 [2024-10-27 21:32:48.109464] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:46.479 [2024-10-27 21:32:48.109567] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1041:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:46.479 [2024-10-27 21:32:48.109585] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:46.479 Starting test contend 00:07:46.479 Worker Delay Wait us Hold us Total us 00:07:46.479 0 3 171513 186533 358047 00:07:46.479 1 5 87047 287379 374426 00:07:46.479 PASS test contend 00:07:46.479 Starting test hold_by_poller 00:07:46.479 PASS test hold_by_poller 00:07:46.479 Starting test hold_by_message 00:07:46.479 PASS test hold_by_message 00:07:46.479 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:46.479 100014 assertions passed 00:07:46.479 0 assertions failed 00:07:46.479 00:07:46.479 real 0m0.735s 00:07:46.479 user 0m1.038s 00:07:46.479 sys 0m0.085s 00:07:46.479 21:32:48 thread.thread_spdk_lock -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:46.479 21:32:48 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:46.479 ************************************ 00:07:46.479 END TEST thread_spdk_lock 00:07:46.479 ************************************ 00:07:46.479 00:07:46.479 real 0m3.639s 00:07:46.479 user 0m3.317s 00:07:46.479 sys 0m0.535s 00:07:46.479 21:32:48 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:46.479 21:32:48 thread -- common/autotest_common.sh@10 -- # set +x 00:07:46.479 ************************************ 00:07:46.479 END TEST thread 00:07:46.479 ************************************ 00:07:46.738 21:32:48 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:46.738 21:32:48 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:46.738 21:32:48 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:46.738 21:32:48 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:46.738 21:32:48 -- common/autotest_common.sh@10 -- # set +x 00:07:46.738 ************************************ 00:07:46.738 START TEST 
app_cmdline 00:07:46.738 ************************************ 00:07:46.738 21:32:48 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:46.738 * Looking for test storage... 00:07:46.738 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:46.738 21:32:48 app_cmdline -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:07:46.738 21:32:48 app_cmdline -- common/autotest_common.sh@1689 -- # lcov --version 00:07:46.738 21:32:48 app_cmdline -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:07:46.738 21:32:48 app_cmdline -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:07:46.738 21:32:48 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:46.738 21:32:48 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:46.738 21:32:48 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:46.738 21:32:48 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:46.738 21:32:48 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:46.738 21:32:48 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:46.739 21:32:48 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:07:46.739 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.739 --rc genhtml_branch_coverage=1 00:07:46.739 --rc genhtml_function_coverage=1 00:07:46.739 --rc genhtml_legend=1 00:07:46.739 --rc geninfo_all_blocks=1 00:07:46.739 --rc geninfo_unexecuted_blocks=1 00:07:46.739 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.739 ' 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:07:46.739 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.739 --rc genhtml_branch_coverage=1 00:07:46.739 --rc genhtml_function_coverage=1 00:07:46.739 --rc genhtml_legend=1 00:07:46.739 --rc geninfo_all_blocks=1 00:07:46.739 --rc geninfo_unexecuted_blocks=1 00:07:46.739 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.739 ' 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:07:46.739 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.739 --rc genhtml_branch_coverage=1 00:07:46.739 --rc genhtml_function_coverage=1 00:07:46.739 --rc genhtml_legend=1 00:07:46.739 --rc geninfo_all_blocks=1 00:07:46.739 --rc geninfo_unexecuted_blocks=1 00:07:46.739 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.739 ' 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:07:46.739 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.739 --rc genhtml_branch_coverage=1 00:07:46.739 --rc genhtml_function_coverage=1 00:07:46.739 --rc genhtml_legend=1 00:07:46.739 --rc geninfo_all_blocks=1 00:07:46.739 --rc geninfo_unexecuted_blocks=1 00:07:46.739 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.739 ' 00:07:46.739 21:32:48 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:46.739 21:32:48 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=3319948 00:07:46.739 21:32:48 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 3319948 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 3319948 ']' 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk.sock 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:46.739 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:46.739 21:32:48 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:46.739 21:32:48 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:46.739 [2024-10-27 21:32:48.456204] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:46.739 [2024-10-27 21:32:48.456269] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3319948 ] 00:07:46.998 [2024-10-27 21:32:48.591887] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:46.998 [2024-10-27 21:32:48.626701] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.998 [2024-10-27 21:32:48.648903] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.567 21:32:49 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:47.567 21:32:49 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:07:47.567 21:32:49 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:47.827 { 00:07:47.827 "version": "SPDK v25.01-pre git sha1 169c3cd04", 00:07:47.827 "fields": { 00:07:47.827 "major": 25, 00:07:47.827 "minor": 1, 00:07:47.827 "patch": 0, 00:07:47.827 "suffix": "-pre", 00:07:47.827 "commit": "169c3cd04" 00:07:47.827 } 00:07:47.827 } 00:07:47.827 21:32:49 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:47.827 21:32:49 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:47.827 21:32:49 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:47.827 21:32:49 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:47.827 21:32:49 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.827 21:32:49 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:47.827 21:32:49 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:47.827 21:32:49 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:47.827 21:32:49 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:47.827 21:32:49 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@652 -- # 
valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:47.827 21:32:49 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:48.089 request: 00:07:48.089 { 00:07:48.089 "method": "env_dpdk_get_mem_stats", 00:07:48.089 "req_id": 1 00:07:48.089 } 00:07:48.089 Got JSON-RPC error response 00:07:48.089 response: 00:07:48.089 { 00:07:48.089 "code": -32601, 00:07:48.089 "message": "Method not found" 00:07:48.089 } 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:48.089 21:32:49 app_cmdline -- app/cmdline.sh@1 -- # killprocess 3319948 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 3319948 ']' 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 3319948 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3319948 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3319948' 00:07:48.089 killing process with pid 3319948 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@969 -- # kill 3319948 00:07:48.089 21:32:49 app_cmdline -- common/autotest_common.sh@974 -- # wait 3319948 00:07:48.351 00:07:48.351 real 0m1.785s 00:07:48.351 user 0m1.985s 00:07:48.351 sys 0m0.513s 00:07:48.351 21:32:50 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:48.351 21:32:50 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:48.351 ************************************ 00:07:48.351 END TEST app_cmdline 00:07:48.351 ************************************ 00:07:48.611 21:32:50 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:48.611 21:32:50 -- 
common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:48.611 21:32:50 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:48.611 21:32:50 -- common/autotest_common.sh@10 -- # set +x 00:07:48.611 ************************************ 00:07:48.611 START TEST version 00:07:48.611 ************************************ 00:07:48.611 21:32:50 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:48.611 * Looking for test storage... 00:07:48.611 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:48.611 21:32:50 version -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:07:48.611 21:32:50 version -- common/autotest_common.sh@1689 -- # lcov --version 00:07:48.611 21:32:50 version -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:07:48.611 21:32:50 version -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:07:48.611 21:32:50 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:48.611 21:32:50 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:48.611 21:32:50 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:48.611 21:32:50 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:48.611 21:32:50 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:48.611 21:32:50 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:48.611 21:32:50 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:48.611 21:32:50 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:48.611 21:32:50 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:48.611 21:32:50 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:48.611 21:32:50 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:48.611 21:32:50 version -- scripts/common.sh@344 -- # case "$op" in 00:07:48.611 21:32:50 version -- scripts/common.sh@345 -- # : 1 00:07:48.611 21:32:50 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:48.611 21:32:50 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:48.611 21:32:50 version -- scripts/common.sh@365 -- # decimal 1 00:07:48.611 21:32:50 version -- scripts/common.sh@353 -- # local d=1 00:07:48.611 21:32:50 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:48.611 21:32:50 version -- scripts/common.sh@355 -- # echo 1 00:07:48.611 21:32:50 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:48.611 21:32:50 version -- scripts/common.sh@366 -- # decimal 2 00:07:48.611 21:32:50 version -- scripts/common.sh@353 -- # local d=2 00:07:48.611 21:32:50 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:48.611 21:32:50 version -- scripts/common.sh@355 -- # echo 2 00:07:48.611 21:32:50 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:48.611 21:32:50 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:48.611 21:32:50 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:48.611 21:32:50 version -- scripts/common.sh@368 -- # return 0 00:07:48.611 21:32:50 version -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:48.611 21:32:50 version -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:07:48.611 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.611 --rc genhtml_branch_coverage=1 00:07:48.611 --rc genhtml_function_coverage=1 00:07:48.611 --rc genhtml_legend=1 00:07:48.611 --rc geninfo_all_blocks=1 00:07:48.611 --rc geninfo_unexecuted_blocks=1 00:07:48.611 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.611 ' 00:07:48.611 21:32:50 version -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:07:48.611 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.611 --rc genhtml_branch_coverage=1 00:07:48.611 --rc genhtml_function_coverage=1 00:07:48.611 --rc genhtml_legend=1 00:07:48.611 --rc geninfo_all_blocks=1 00:07:48.611 --rc geninfo_unexecuted_blocks=1 00:07:48.611 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.611 ' 00:07:48.611 21:32:50 version -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:07:48.611 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.611 --rc genhtml_branch_coverage=1 00:07:48.611 --rc genhtml_function_coverage=1 00:07:48.611 --rc genhtml_legend=1 00:07:48.611 --rc geninfo_all_blocks=1 00:07:48.611 --rc geninfo_unexecuted_blocks=1 00:07:48.611 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.611 ' 00:07:48.611 21:32:50 version -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:07:48.611 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:48.611 --rc genhtml_branch_coverage=1 00:07:48.611 --rc genhtml_function_coverage=1 00:07:48.611 --rc genhtml_legend=1 00:07:48.611 --rc geninfo_all_blocks=1 00:07:48.611 --rc geninfo_unexecuted_blocks=1 00:07:48.611 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:48.611 ' 00:07:48.611 21:32:50 version -- app/version.sh@17 -- # get_header_version major 00:07:48.611 21:32:50 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:48.611 21:32:50 version -- app/version.sh@14 -- # cut -f2 00:07:48.611 21:32:50 version -- app/version.sh@14 -- # tr -d '"' 00:07:48.611 21:32:50 version -- app/version.sh@17 -- # major=25 00:07:48.611 21:32:50 version -- 
app/version.sh@18 -- # get_header_version minor 00:07:48.611 21:32:50 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:48.611 21:32:50 version -- app/version.sh@14 -- # cut -f2 00:07:48.611 21:32:50 version -- app/version.sh@14 -- # tr -d '"' 00:07:48.611 21:32:50 version -- app/version.sh@18 -- # minor=1 00:07:48.611 21:32:50 version -- app/version.sh@19 -- # get_header_version patch 00:07:48.611 21:32:50 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:48.611 21:32:50 version -- app/version.sh@14 -- # cut -f2 00:07:48.611 21:32:50 version -- app/version.sh@14 -- # tr -d '"' 00:07:48.871 21:32:50 version -- app/version.sh@19 -- # patch=0 00:07:48.871 21:32:50 version -- app/version.sh@20 -- # get_header_version suffix 00:07:48.871 21:32:50 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:48.871 21:32:50 version -- app/version.sh@14 -- # cut -f2 00:07:48.871 21:32:50 version -- app/version.sh@14 -- # tr -d '"' 00:07:48.871 21:32:50 version -- app/version.sh@20 -- # suffix=-pre 00:07:48.871 21:32:50 version -- app/version.sh@22 -- # version=25.1 00:07:48.871 21:32:50 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:48.871 21:32:50 version -- app/version.sh@28 -- # version=25.1rc0 00:07:48.871 21:32:50 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:48.871 21:32:50 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:48.871 21:32:50 version -- app/version.sh@30 -- # py_version=25.1rc0 00:07:48.871 21:32:50 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:07:48.871 00:07:48.871 real 0m0.268s 00:07:48.871 user 0m0.149s 00:07:48.871 sys 0m0.175s 00:07:48.871 21:32:50 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:48.871 21:32:50 version -- common/autotest_common.sh@10 -- # set +x 00:07:48.871 ************************************ 00:07:48.871 END TEST version 00:07:48.871 ************************************ 00:07:48.871 21:32:50 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:48.871 21:32:50 -- spdk/autotest.sh@194 -- # uname -s 00:07:48.871 21:32:50 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:48.871 21:32:50 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:48.871 21:32:50 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:48.871 21:32:50 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@252 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@256 -- # timing_exit lib 00:07:48.871 21:32:50 -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:48.871 21:32:50 -- common/autotest_common.sh@10 -- # set +x 00:07:48.871 21:32:50 -- spdk/autotest.sh@258 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@263 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@272 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- 
spdk/autotest.sh@307 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@334 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@351 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:48.871 21:32:50 -- spdk/autotest.sh@362 -- # [[ 0 -eq 1 ]] 00:07:48.871 21:32:50 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:48.871 21:32:50 -- spdk/autotest.sh@370 -- # [[ 1 -eq 1 ]] 00:07:48.871 21:32:50 -- spdk/autotest.sh@371 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:48.871 21:32:50 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:48.871 21:32:50 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:48.871 21:32:50 -- common/autotest_common.sh@10 -- # set +x 00:07:48.871 ************************************ 00:07:48.871 START TEST llvm_fuzz 00:07:48.871 ************************************ 00:07:48.871 21:32:50 llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:48.871 * Looking for test storage... 00:07:49.131 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:49.131 21:32:50 llvm_fuzz -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:07:49.131 21:32:50 llvm_fuzz -- common/autotest_common.sh@1689 -- # lcov --version 00:07:49.131 21:32:50 llvm_fuzz -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:07:49.131 21:32:50 llvm_fuzz -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:49.131 21:32:50 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:49.131 21:32:50 llvm_fuzz -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:49.131 21:32:50 llvm_fuzz -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:07:49.132 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.132 --rc genhtml_branch_coverage=1 00:07:49.132 --rc genhtml_function_coverage=1 00:07:49.132 --rc genhtml_legend=1 00:07:49.132 --rc geninfo_all_blocks=1 00:07:49.132 --rc geninfo_unexecuted_blocks=1 00:07:49.132 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.132 ' 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:07:49.132 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.132 --rc genhtml_branch_coverage=1 00:07:49.132 --rc genhtml_function_coverage=1 00:07:49.132 --rc genhtml_legend=1 00:07:49.132 --rc geninfo_all_blocks=1 00:07:49.132 --rc geninfo_unexecuted_blocks=1 00:07:49.132 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.132 ' 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:07:49.132 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.132 --rc genhtml_branch_coverage=1 00:07:49.132 --rc genhtml_function_coverage=1 00:07:49.132 --rc genhtml_legend=1 00:07:49.132 --rc geninfo_all_blocks=1 00:07:49.132 --rc geninfo_unexecuted_blocks=1 00:07:49.132 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.132 ' 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:07:49.132 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.132 --rc genhtml_branch_coverage=1 00:07:49.132 --rc genhtml_function_coverage=1 00:07:49.132 --rc genhtml_legend=1 00:07:49.132 --rc geninfo_all_blocks=1 00:07:49.132 --rc geninfo_unexecuted_blocks=1 00:07:49.132 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.132 ' 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@548 -- # local fuzzers 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:49.132 21:32:50 llvm_fuzz -- 
common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:49.132 21:32:50 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:49.132 21:32:50 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:49.132 ************************************ 00:07:49.132 START TEST nvmf_llvm_fuzz 00:07:49.132 ************************************ 00:07:49.132 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:49.132 * Looking for test storage... 
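The xtrace just above records how llvm.sh assembles its worklist before this nvmf run starts: get_fuzzer_targets globs every entry under test/fuzz/llvm/, "${fuzzers[@]##*/}" strips the leading directories, and the caller walks the resulting names through a case statement so that helper files such as common.sh and llvm-gcov.sh fall through while nvmf and vfio are handed to run_test. A minimal sketch of that discover-and-dispatch pattern follows; the rootdir default here is an assumption for illustration, not the exact SPDK source:

#!/usr/bin/env bash
# Sketch of the target discovery traced by llvm.sh above.
# The rootdir default is hypothetical; SPDK derives it from the script location.
rootdir=${rootdir:-$HOME/spdk}

get_fuzzer_targets() {
  local fuzzers=("$rootdir/test/fuzz/llvm/"*)   # glob the directory entries
  fuzzers=("${fuzzers[@]##*/}")                 # keep basenames only
  echo "${fuzzers[@]}"                          # e.g. common.sh llvm-gcov.sh nvmf vfio
}

for fuzzer in $(get_fuzzer_targets); do
  case "$fuzzer" in
    nvmf|vfio)
      bash "$rootdir/test/fuzz/llvm/$fuzzer/run.sh" ;;  # real targets are executed
    *) ;;                                               # helper scripts are skipped
  esac
done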
00:07:49.132 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.132 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:07:49.132 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:07:49.132 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1689 -- # lcov --version 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:07:49.396 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.396 --rc genhtml_branch_coverage=1 00:07:49.396 --rc genhtml_function_coverage=1 00:07:49.396 --rc genhtml_legend=1 00:07:49.396 --rc geninfo_all_blocks=1 00:07:49.396 --rc geninfo_unexecuted_blocks=1 00:07:49.396 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.396 ' 00:07:49.396 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:07:49.396 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.396 --rc genhtml_branch_coverage=1 00:07:49.396 --rc genhtml_function_coverage=1 00:07:49.396 --rc genhtml_legend=1 00:07:49.396 --rc geninfo_all_blocks=1 00:07:49.396 --rc geninfo_unexecuted_blocks=1 00:07:49.396 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.396 ' 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:07:49.397 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.397 --rc genhtml_branch_coverage=1 00:07:49.397 --rc genhtml_function_coverage=1 00:07:49.397 --rc genhtml_legend=1 00:07:49.397 --rc geninfo_all_blocks=1 00:07:49.397 --rc geninfo_unexecuted_blocks=1 00:07:49.397 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.397 ' 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:07:49.397 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.397 --rc genhtml_branch_coverage=1 00:07:49.397 --rc genhtml_function_coverage=1 00:07:49.397 --rc genhtml_legend=1 00:07:49.397 --rc geninfo_all_blocks=1 00:07:49.397 --rc geninfo_unexecuted_blocks=1 00:07:49.397 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.397 ' 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:07:49.397 21:32:50 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:49.397 21:32:50 
llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:49.397 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:49.398 #define SPDK_CONFIG_H 00:07:49.398 #define SPDK_CONFIG_AIO_FSDEV 1 00:07:49.398 #define SPDK_CONFIG_APPS 1 00:07:49.398 #define SPDK_CONFIG_ARCH native 00:07:49.398 #undef SPDK_CONFIG_ASAN 00:07:49.398 #undef SPDK_CONFIG_AVAHI 00:07:49.398 #undef SPDK_CONFIG_CET 00:07:49.398 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:07:49.398 #define SPDK_CONFIG_COVERAGE 1 00:07:49.398 #define SPDK_CONFIG_CROSS_PREFIX 00:07:49.398 #undef SPDK_CONFIG_CRYPTO 00:07:49.398 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:49.398 #undef SPDK_CONFIG_CUSTOMOCF 00:07:49.398 #undef SPDK_CONFIG_DAOS 00:07:49.398 #define SPDK_CONFIG_DAOS_DIR 00:07:49.398 #define SPDK_CONFIG_DEBUG 1 00:07:49.398 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:49.398 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:49.398 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:49.398 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:49.398 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:49.398 #undef SPDK_CONFIG_DPDK_UADK 00:07:49.398 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:49.398 #define SPDK_CONFIG_EXAMPLES 1 00:07:49.398 #undef SPDK_CONFIG_FC 00:07:49.398 #define SPDK_CONFIG_FC_PATH 00:07:49.398 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:49.398 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:49.398 #define SPDK_CONFIG_FSDEV 1 00:07:49.398 #undef 
SPDK_CONFIG_FUSE 00:07:49.398 #define SPDK_CONFIG_FUZZER 1 00:07:49.398 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:49.398 #undef SPDK_CONFIG_GOLANG 00:07:49.398 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:49.398 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:49.398 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:49.398 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:07:49.398 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:49.398 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:49.398 #undef SPDK_CONFIG_HAVE_LZ4 00:07:49.398 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:07:49.398 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:07:49.398 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:49.398 #define SPDK_CONFIG_IDXD 1 00:07:49.398 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:49.398 #undef SPDK_CONFIG_IPSEC_MB 00:07:49.398 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:49.398 #define SPDK_CONFIG_ISAL 1 00:07:49.398 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:49.398 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:49.398 #define SPDK_CONFIG_LIBDIR 00:07:49.398 #undef SPDK_CONFIG_LTO 00:07:49.398 #define SPDK_CONFIG_MAX_LCORES 128 00:07:49.398 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:07:49.398 #define SPDK_CONFIG_NVME_CUSE 1 00:07:49.398 #undef SPDK_CONFIG_OCF 00:07:49.398 #define SPDK_CONFIG_OCF_PATH 00:07:49.398 #define SPDK_CONFIG_OPENSSL_PATH 00:07:49.398 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:49.398 #define SPDK_CONFIG_PGO_DIR 00:07:49.398 #undef SPDK_CONFIG_PGO_USE 00:07:49.398 #define SPDK_CONFIG_PREFIX /usr/local 00:07:49.398 #undef SPDK_CONFIG_RAID5F 00:07:49.398 #undef SPDK_CONFIG_RBD 00:07:49.398 #define SPDK_CONFIG_RDMA 1 00:07:49.398 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:49.398 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:49.398 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:49.398 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:49.398 #undef SPDK_CONFIG_SHARED 00:07:49.398 #undef SPDK_CONFIG_SMA 00:07:49.398 #define SPDK_CONFIG_TESTS 1 00:07:49.398 #undef SPDK_CONFIG_TSAN 00:07:49.398 #define SPDK_CONFIG_UBLK 1 00:07:49.398 #define SPDK_CONFIG_UBSAN 1 00:07:49.398 #undef SPDK_CONFIG_UNIT_TESTS 00:07:49.398 #undef SPDK_CONFIG_URING 00:07:49.398 #define SPDK_CONFIG_URING_PATH 00:07:49.398 #undef SPDK_CONFIG_URING_ZNS 00:07:49.398 #undef SPDK_CONFIG_USDT 00:07:49.398 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:49.398 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:49.398 #define SPDK_CONFIG_VFIO_USER 1 00:07:49.398 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:49.398 #define SPDK_CONFIG_VHOST 1 00:07:49.398 #define SPDK_CONFIG_VIRTIO 1 00:07:49.398 #undef SPDK_CONFIG_VTUNE 00:07:49.398 #define SPDK_CONFIG_VTUNE_DIR 00:07:49.398 #define SPDK_CONFIG_WERROR 1 00:07:49.398 #define SPDK_CONFIG_WPDK_DIR 00:07:49.398 #undef SPDK_CONFIG_XNVME 00:07:49.398 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:49.398 21:32:50 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:07:49.398 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:07:49.399 21:32:51 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:07:49.399 
21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : main 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 
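The long run of paired ": 0" (or ": 1") and "export SPDK_TEST_*" steps around this point is consistent with the standard bash default-then-export idiom: : "${VAR:=default}" assigns the default only when the caller left the variable unset or empty, the no-op : builtin discards the expansion (which is why xtrace shows a bare value), and the export makes the decision visible to every child process of the test run. In this job SPDK_TEST_FUZZER and SPDK_TEST_FUZZER_SHORT resolve to 1 while most other switches stay 0. A minimal sketch of the idiom with one illustrative switch:

#!/usr/bin/env bash
# Default-and-export idiom matching the ": 0" / "export ..." pairs in the trace.
: "${SPDK_TEST_FUZZER:=0}"   # keep the caller's value, otherwise default to 0
export SPDK_TEST_FUZZER      # propagate the switch to child processes

# Callers override from the environment when launching the suite:
#   SPDK_TEST_FUZZER=1 ./autotest.sh
echo "fuzzer tests enabled: $SPDK_TEST_FUZZER"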
00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:49.399 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@189 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@198 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # cat 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']' 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@257 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@263 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV= 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]] 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ 1 -eq 1 ]] 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV=1 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]= 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']' 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@277 -- # export valgrind= 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@277 -- # valgrind= 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@283 -- # uname -s 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']' 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@284 -- # HUGEMEM=4096 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # MAKE=make 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j112 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@304 -- # export HUGEMEM=4096 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@304 -- # HUGEMEM=4096 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # NO_HUGE=() 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@307 -- # TEST_MODE= 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@329 -- # [[ -z 3320572 ]] 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@329 -- # kill -0 3320572 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1674 -- # set_test_storage 2147483648 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@339 -- # [[ -v testdir ]] 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # local requested_size=2147483648 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@342 -- # local mount target_dir 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@345 -- # local source fs size avail mount use 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.44T1fe 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # [[ -n '' ]] 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@361 -- # [[ -n '' ]] 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@366 -- # mkdir -p 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.44T1fe/tests/nvmf /tmp/spdk.44T1fe 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@369 -- # requested_size=2214592512 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@338 -- # df -T 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@338 -- # grep -v Filesystem 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=0 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=607576064 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=4676853760 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=51425652736 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=61730611200 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=10304958464 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:49.400 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30860541952 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865305600 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=4763648 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- 
# avails["$mount"]=12340129792 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346122240 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5992448 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30864478208 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865305600 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=827392 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=6173048832 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173061120 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=12288 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n' 00:07:49.401 * Looking for test storage... 
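Before this "Looking for test storage" banner, set_test_storage snapshots every mount point by parsing `df -T` into the associative arrays traced above (mounts, fss, sizes, avails, uses). A condensed, runnable sketch of that loop, reconstructed from the traced commands; the byte-granular numbers suggest the real helper arranges for df to report bytes rather than 1K blocks, which is an assumption here:

    probe_mounts() {
        local -A mounts fss sizes avails uses
        local source fs size use avail mount
        # Column order matches the traced read:
        # Filesystem Type Size Used Avail Use% Mounted-on
        while read -r source fs size use avail _ mount; do
            mounts["$mount"]=$source
            fss["$mount"]=$fs
            sizes["$mount"]=$size
            uses["$mount"]=$use
            avails["$mount"]=$avail
        done < <(df -T | grep -v Filesystem)
        declare -p avails   # e.g. avails[/]=51425652736, as seen in the trace
    }

The candidate walk that follows compares avails for each candidate's mount (target_space) against the requested 2 GiB, padded to 2214592512 bytes in the trace, and keeps the first directory with enough headroom.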
00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # local target_space new_size 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}" 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # mount=/ 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # target_space=51425652736 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size )) 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@389 -- # (( target_space >= requested_size )) 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]] 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]] 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ / == / ]] 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@392 -- # new_size=12519550976 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.401 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # return 0 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1676 -- # set -o errtrace 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1677 -- # shopt -s extdebug 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1678 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # true 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1683 -- # xtrace_fd 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1689 -- # lcov --version 00:07:49.401 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:07:49.662 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.662 --rc genhtml_branch_coverage=1 00:07:49.662 --rc genhtml_function_coverage=1 00:07:49.662 --rc genhtml_legend=1 00:07:49.662 --rc geninfo_all_blocks=1 00:07:49.662 --rc geninfo_unexecuted_blocks=1 00:07:49.662 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.662 ' 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:07:49.662 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.662 --rc genhtml_branch_coverage=1 00:07:49.662 --rc genhtml_function_coverage=1 00:07:49.662 --rc genhtml_legend=1 00:07:49.662 --rc geninfo_all_blocks=1 00:07:49.662 --rc geninfo_unexecuted_blocks=1 00:07:49.662 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.662 ' 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:07:49.662 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.662 --rc genhtml_branch_coverage=1 00:07:49.662 --rc genhtml_function_coverage=1 00:07:49.662 --rc genhtml_legend=1 00:07:49.662 --rc geninfo_all_blocks=1 00:07:49.662 --rc geninfo_unexecuted_blocks=1 00:07:49.662 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.662 ' 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:07:49.662 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.662 --rc genhtml_branch_coverage=1 00:07:49.662 --rc genhtml_function_coverage=1 00:07:49.662 --rc genhtml_legend=1 00:07:49.662 --rc geninfo_all_blocks=1 00:07:49.662 --rc geninfo_unexecuted_blocks=1 00:07:49.662 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.662 ' 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:49.662 21:32:51 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:49.662 21:32:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:49.662 [2024-10-27 21:32:51.260999] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:49.662 [2024-10-27 21:32:51.261080] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3320635 ] 00:07:49.922 [2024-10-27 21:32:51.593167] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:49.922 [2024-10-27 21:32:51.641370] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.181 [2024-10-27 21:32:51.660046] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.181 [2024-10-27 21:32:51.712433] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.181 [2024-10-27 21:32:51.728730] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:50.181 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.181 INFO: Seed: 766820258 00:07:50.181 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:07:50.181 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:07:50.181 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:50.181 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.181 #2 INITED exec/s: 0 rss: 65Mb 00:07:50.181 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:50.181 This may also happen if the target rejected all inputs we tried so far 00:07:50.181 [2024-10-27 21:32:51.805367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:50.181 [2024-10-27 21:32:51.805408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.441 NEW_FUNC[1/715]: 0x45ed58 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:50.441 NEW_FUNC[2/715]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.441 #3 NEW cov: 12160 ft: 12159 corp: 2/93b lim: 320 exec/s: 0 rss: 72Mb L: 92/92 MS: 1 InsertRepeatedBytes- 00:07:50.441 [2024-10-27 21:32:52.144806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757f25757575757 00:07:50.441 [2024-10-27 21:32:52.144849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.700 NEW_FUNC[1/1]: 0x1a444b8 in nvme_tcp_ctrlr_connect_qpair_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_tcp.c:2299 00:07:50.700 #19 NEW cov: 12297 ft: 12897 corp: 3/186b lim: 320 exec/s: 0 rss: 72Mb L: 93/93 MS: 1 InsertByte- 00:07:50.700 [2024-10-27 21:32:52.214704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x57f2575757575757 00:07:50.700 [2024-10-27 21:32:52.214733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.700 #20 NEW cov: 12303 ft: 13184 corp: 4/280b lim: 320 exec/s: 0 rss: 73Mb L: 94/94 MS: 1 InsertByte- 00:07:50.700 [2024-10-27 21:32:52.274714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757f25757575757 00:07:50.700 [2024-10-27 21:32:52.274744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.700 #21 NEW cov: 12388 ft: 13454 corp: 5/374b lim: 320 exec/s: 0 rss: 73Mb L: 94/94 MS: 1 ShuffleBytes- 00:07:50.700 [2024-10-27 21:32:52.344755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757f25757575757 00:07:50.700 [2024-10-27 21:32:52.344783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.700 #22 NEW cov: 12388 ft: 13635 corp: 6/467b lim: 320 exec/s: 0 rss: 73Mb L: 93/94 MS: 1 CopyPart- 00:07:50.700 [2024-10-27 21:32:52.394787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x57f2575757575757 00:07:50.700 [2024-10-27 21:32:52.394814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.700 #23 NEW cov: 12388 ft: 13808 corp: 7/561b lim: 320 exec/s: 0 rss: 73Mb L: 94/94 MS: 1 ChangeByte- 00:07:50.959 [2024-10-27 21:32:52.444981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x57f2575757575757 00:07:50.959 [2024-10-27 21:32:52.445009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.959 #24 NEW cov: 12388 ft: 13937 corp: 8/655b lim: 320 exec/s: 0 rss: 73Mb L: 94/94 MS: 1 ChangeBinInt- 00:07:50.959 [2024-10-27 21:32:52.494896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757f25757575757 00:07:50.960 [2024-10-27 21:32:52.494924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.960 #25 NEW cov: 12388 ft: 13967 corp: 9/748b lim: 320 exec/s: 0 rss: 73Mb L: 93/94 MS: 1 ShuffleBytes- 00:07:50.960 [2024-10-27 21:32:52.564993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:0a000000 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:50.960 [2024-10-27 21:32:52.565021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.960 #26 NEW cov: 12388 ft: 13998 corp: 10/848b lim: 320 exec/s: 0 rss: 73Mb L: 100/100 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\012"- 00:07:50.960 [2024-10-27 21:32:52.615009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:50.960 [2024-10-27 21:32:52.615037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.960 #27 NEW cov: 12388 ft: 14076 corp: 11/961b lim: 320 exec/s: 0 rss: 73Mb L: 113/113 MS: 1 CopyPart- 00:07:50.960 [2024-10-27 21:32:52.685243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:50.960 [2024-10-27 21:32:52.685271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.219 [2024-10-27 21:32:52.685397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:5 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.219 [2024-10-27 21:32:52.685417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.219 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:51.219 #28 NEW cov: 12411 ft: 14339 corp: 12/1139b lim: 320 exec/s: 0 rss: 73Mb L: 178/178 MS: 1 CopyPart- 00:07:51.219 [2024-10-27 21:32:52.755109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.219 [2024-10-27 21:32:52.755138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.219 #29 NEW cov: 12411 ft: 14349 corp: 13/1253b lim: 320 exec/s: 29 rss: 73Mb L: 114/178 MS: 1 InsertByte- 00:07:51.219 [2024-10-27 21:32:52.825208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (d6) qid:0 cid:4 nsid:dededede cdw10:dededede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0xdededededededede 00:07:51.219 [2024-10-27 21:32:52.825235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.219 #33 NEW cov: 12430 ft: 14419 corp: 14/1327b lim: 320 exec/s: 33 rss: 73Mb L: 74/178 MS: 4 InsertByte-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:51.219 [2024-10-27 21:32:52.875206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x57f25757575f5757 00:07:51.219 [2024-10-27 21:32:52.875234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.219 #34 NEW cov: 12430 ft: 14442 corp: 15/1421b lim: 320 exec/s: 34 rss: 73Mb L: 94/178 MS: 1 ChangeBit- 00:07:51.219 [2024-10-27 21:32:52.925346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.219 [2024-10-27 21:32:52.925377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.478 #35 NEW cov: 12430 ft: 14463 corp: 16/1513b lim: 320 exec/s: 35 rss: 73Mb L: 92/178 MS: 1 CopyPart- 00:07:51.478 [2024-10-27 21:32:52.975523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND 
(d6) qid:0 cid:4 nsid:dededede cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.478 [2024-10-27 21:32:52.975552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.478 [2024-10-27 21:32:52.975689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:5 nsid:0 cdw10:dededede cdw11:dededede SGL TRANSPORT DATA BLOCK TRANSPORT 0xdededededededede 00:07:51.479 [2024-10-27 21:32:52.975708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.479 #36 NEW cov: 12430 ft: 14525 corp: 17/1701b lim: 320 exec/s: 36 rss: 73Mb L: 188/188 MS: 1 CrossOver- 00:07:51.479 [2024-10-27 21:32:53.045385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.479 [2024-10-27 21:32:53.045417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.479 #37 NEW cov: 12430 ft: 14560 corp: 18/1825b lim: 320 exec/s: 37 rss: 73Mb L: 124/188 MS: 1 CopyPart- 00:07:51.479 [2024-10-27 21:32:53.095423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.479 [2024-10-27 21:32:53.095453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.479 #38 NEW cov: 12430 ft: 14573 corp: 19/1917b lim: 320 exec/s: 38 rss: 73Mb L: 92/188 MS: 1 ChangeByte- 00:07:51.479 [2024-10-27 21:32:53.145427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:0a000000 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.479 [2024-10-27 21:32:53.145458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.479 #39 NEW cov: 12430 ft: 14615 corp: 20/2015b lim: 320 exec/s: 39 rss: 73Mb L: 98/188 MS: 1 EraseBytes- 00:07:51.479 [2024-10-27 21:32:53.195869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:1a1a1a1a cdw11:1a1a1a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x1a1a1a1a1a1a1a1a 00:07:51.479 [2024-10-27 21:32:53.195899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.479 [2024-10-27 21:32:53.196033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:1a1a1a1a cdw10:1a1a1a1a cdw11:5757571a SGL TRANSPORT DATA BLOCK TRANSPORT 0x1a1a1a1a1a1a1a1a 00:07:51.479 [2024-10-27 21:32:53.196052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.479 [2024-10-27 21:32:53.196180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:6 nsid:5757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.479 [2024-10-27 21:32:53.196197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.738 #40 NEW cov: 12430 ft: 14790 corp: 21/2213b 
lim: 320 exec/s: 40 rss: 73Mb L: 198/198 MS: 1 InsertRepeatedBytes- 00:07:51.738 [2024-10-27 21:32:53.245538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x57f2575757575757 00:07:51.738 [2024-10-27 21:32:53.245566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.738 #41 NEW cov: 12430 ft: 14801 corp: 22/2307b lim: 320 exec/s: 41 rss: 73Mb L: 94/198 MS: 1 ChangeBit- 00:07:51.738 [2024-10-27 21:32:53.295488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.738 [2024-10-27 21:32:53.295516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.738 #42 NEW cov: 12430 ft: 14816 corp: 23/2421b lim: 320 exec/s: 42 rss: 73Mb L: 114/198 MS: 1 ChangeBinInt- 00:07:51.738 [2024-10-27 21:32:53.365551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:04045757 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x57f2575757575757 00:07:51.738 [2024-10-27 21:32:53.365579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.738 #45 NEW cov: 12430 ft: 14846 corp: 24/2518b lim: 320 exec/s: 45 rss: 73Mb L: 97/198 MS: 3 EraseBytes-EraseBytes-InsertRepeatedBytes- 00:07:51.738 [2024-10-27 21:32:53.415576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:0a000000 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.738 [2024-10-27 21:32:53.415603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.738 #46 NEW cov: 12430 ft: 14864 corp: 25/2585b lim: 320 exec/s: 46 rss: 74Mb L: 67/198 MS: 1 EraseBytes- 00:07:51.997 [2024-10-27 21:32:53.485530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x57f2575757575757 00:07:51.997 [2024-10-27 21:32:53.485558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.997 #47 NEW cov: 12430 ft: 14873 corp: 26/2683b lim: 320 exec/s: 47 rss: 74Mb L: 98/198 MS: 1 CMP- DE: ";\001\000\000"- 00:07:51.997 [2024-10-27 21:32:53.555685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:07:51.997 [2024-10-27 21:32:53.555714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.997 #53 NEW cov: 12430 ft: 14877 corp: 27/2747b lim: 320 exec/s: 53 rss: 74Mb L: 64/198 MS: 1 EraseBytes- 00:07:51.997 [2024-10-27 21:32:53.625757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:51.997 [2024-10-27 21:32:53.625784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:51.997 #54 NEW cov: 12430 ft: 14889 corp: 28/2826b lim: 320 exec/s: 54 rss: 74Mb L: 79/198 MS: 1 EraseBytes- 00:07:51.997 [2024-10-27 21:32:53.695762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x57f2575757575757 00:07:51.997 [2024-10-27 21:32:53.695790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.257 #55 NEW cov: 12430 ft: 14935 corp: 29/2948b lim: 320 exec/s: 55 rss: 74Mb L: 122/198 MS: 1 CrossOver- 00:07:52.257 [2024-10-27 21:32:53.765813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (57) qid:0 cid:4 nsid:57575757 cdw10:57575757 cdw11:57575757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5757575757575757 00:07:52.257 [2024-10-27 21:32:53.765841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.257 #56 NEW cov: 12430 ft: 14942 corp: 30/3031b lim: 320 exec/s: 28 rss: 74Mb L: 83/198 MS: 1 PersAutoDict- DE: ";\001\000\000"- 00:07:52.257 #56 DONE cov: 12430 ft: 14942 corp: 30/3031b lim: 320 exec/s: 28 rss: 74Mb 00:07:52.257 ###### Recommended dictionary. ###### 00:07:52.257 "\000\000\000\000\000\000\000\012" # Uses: 0 00:07:52.257 ";\001\000\000" # Uses: 1 00:07:52.257 ###### End of recommended dictionary. ###### 00:07:52.257 Done 56 runs in 2 second(s) 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz 
-- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:52.257 21:32:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:52.257 [2024-10-27 21:32:53.950480] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:52.257 [2024-10-27 21:32:53.950541] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3321164 ] 00:07:52.826 [2024-10-27 21:32:54.263449] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:52.826 [2024-10-27 21:32:54.310621] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.826 [2024-10-27 21:32:54.327151] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.826 [2024-10-27 21:32:54.379813] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.826 [2024-10-27 21:32:54.396129] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:52.826 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.826 INFO: Seed: 3433822452 00:07:52.826 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:07:52.826 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:07:52.826 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:52.826 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.826 #2 INITED exec/s: 0 rss: 65Mb 00:07:52.826 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:52.826 This may also happen if the target rejected all inputs we tried so far 00:07:52.826 [2024-10-27 21:32:54.451105] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004242 00:07:52.826 [2024-10-27 21:32:54.451318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.826 [2024-10-27 21:32:54.451348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.086 NEW_FUNC[1/716]: 0x45f658 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:53.087 NEW_FUNC[2/716]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.087 #11 NEW cov: 12251 ft: 12246 corp: 2/10b lim: 30 exec/s: 0 rss: 72Mb L: 9/9 MS: 4 CopyPart-CrossOver-CrossOver-InsertRepeatedBytes- 00:07:53.087 [2024-10-27 21:32:54.761261] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:53.087 [2024-10-27 21:32:54.761497] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004242 00:07:53.087 [2024-10-27 21:32:54.761724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.087 [2024-10-27 21:32:54.761755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.087 [2024-10-27 21:32:54.761815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.087 [2024-10-27 21:32:54.761830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.087 [2024-10-27 21:32:54.761885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.087 [2024-10-27 21:32:54.761899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.087 #17 NEW cov: 12404 ft: 13146 corp: 3/28b lim: 30 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 InsertRepeatedBytes- 00:07:53.346 [2024-10-27 21:32:54.821219] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:53.346 [2024-10-27 21:32:54.821440] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000420a 00:07:53.346 [2024-10-27 21:32:54.821652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.346 [2024-10-27 21:32:54.821679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.347 [2024-10-27 21:32:54.821737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:54.821751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:53.347 [2024-10-27 21:32:54.821804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:54.821818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.347 #18 NEW cov: 12410 ft: 13306 corp: 4/47b lim: 30 exec/s: 0 rss: 73Mb L: 19/19 MS: 1 CrossOver- 00:07:53.347 [2024-10-27 21:32:54.881237] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b6b6 00:07:53.347 [2024-10-27 21:32:54.881354] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b6b6 00:07:53.347 [2024-10-27 21:32:54.881569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ab602b6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:54.881595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.347 [2024-10-27 21:32:54.881653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b6b602b6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:54.881667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.347 #19 NEW cov: 12495 ft: 13958 corp: 5/62b lim: 30 exec/s: 0 rss: 73Mb L: 15/19 MS: 1 InsertRepeatedBytes- 00:07:53.347 [2024-10-27 21:32:54.921223] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10508) > buf size (4096) 00:07:53.347 [2024-10-27 21:32:54.921340] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4242 00:07:53.347 [2024-10-27 21:32:54.921549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:54.921575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.347 [2024-10-27 21:32:54.921632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:54.921646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.347 #25 NEW cov: 12495 ft: 14055 corp: 6/79b lim: 30 exec/s: 0 rss: 73Mb L: 17/19 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\020"- 00:07:53.347 [2024-10-27 21:32:54.961259] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:53.347 [2024-10-27 21:32:54.961587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:54.961612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.347 [2024-10-27 21:32:54.961669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:54.961683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.347 #26 NEW cov: 12495 ft: 14116 corp: 7/91b lim: 30 exec/s: 0 rss: 73Mb L: 12/19 MS: 1 EraseBytes- 00:07:53.347 [2024-10-27 21:32:55.021305] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004242 00:07:53.347 [2024-10-27 21:32:55.021424] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004242 00:07:53.347 [2024-10-27 21:32:55.021653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:55.021679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.347 [2024-10-27 21:32:55.021738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:42420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:55.021752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.347 #27 NEW cov: 12495 ft: 14263 corp: 8/108b lim: 30 exec/s: 0 rss: 73Mb L: 17/19 MS: 1 CopyPart- 00:07:53.347 [2024-10-27 21:32:55.061391] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:53.347 [2024-10-27 21:32:55.061506] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b1b1 00:07:53.347 [2024-10-27 21:32:55.061615] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b1b1 00:07:53.347 [2024-10-27 21:32:55.061721] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (181252) > buf size (4096) 00:07:53.347 [2024-10-27 21:32:55.061930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:55.061961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.347 [2024-10-27 21:32:55.062017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:55.062032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.347 [2024-10-27 21:32:55.062086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b1b181b1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:55.062099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.347 [2024-10-27 21:32:55.062153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:b1000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.347 [2024-10-27 21:32:55.062167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.607 #28 NEW cov: 12495 ft: 14782 corp: 9/136b lim: 30 exec/s: 0 rss: 73Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:07:53.607 [2024-10-27 21:32:55.101320] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004242 00:07:53.607 [2024-10-27 21:32:55.101531] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:42428142 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.101556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.607 #30 NEW cov: 12495 ft: 14856 corp: 10/142b lim: 30 exec/s: 0 rss: 73Mb L: 6/28 MS: 2 EraseBytes-InsertByte- 00:07:53.607 [2024-10-27 21:32:55.141479] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:53.607 [2024-10-27 21:32:55.141695] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:07:53.607 [2024-10-27 21:32:55.141902] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a42 00:07:53.607 [2024-10-27 21:32:55.142125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.142151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.607 [2024-10-27 21:32:55.142209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.142223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.607 [2024-10-27 21:32:55.142276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.142289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.607 [2024-10-27 21:32:55.142345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.142359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.607 [2024-10-27 21:32:55.142414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:42420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.142427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.607 #31 NEW cov: 12495 ft: 14970 corp: 11/172b lim: 30 exec/s: 0 rss: 73Mb L: 30/30 MS: 1 CopyPart- 00:07:53.607 [2024-10-27 21:32:55.181431] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:53.607 [2024-10-27 21:32:55.181650] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004242 00:07:53.607 [2024-10-27 21:32:55.181863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.181889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.607 [2024-10-27 21:32:55.181950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE 
(02) qid:0 cid:5 nsid:0 cdw10:00000040 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.181964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.607 [2024-10-27 21:32:55.182019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.182033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.607 #32 NEW cov: 12495 ft: 15028 corp: 12/192b lim: 30 exec/s: 0 rss: 73Mb L: 20/30 MS: 1 InsertByte- 00:07:53.607 [2024-10-27 21:32:55.221474] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004242 00:07:53.607 [2024-10-27 21:32:55.221592] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004242 00:07:53.607 [2024-10-27 21:32:55.221800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.221826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.607 [2024-10-27 21:32:55.221881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:42420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.221896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.607 #33 NEW cov: 12495 ft: 15051 corp: 13/209b lim: 30 exec/s: 0 rss: 73Mb L: 17/30 MS: 1 ChangeByte- 00:07:53.607 [2024-10-27 21:32:55.281547] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:53.607 [2024-10-27 21:32:55.281665] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b1b1 00:07:53.607 [2024-10-27 21:32:55.281775] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b1b1 00:07:53.607 [2024-10-27 21:32:55.281886] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (181252) > buf size (4096) 00:07:53.607 [2024-10-27 21:32:55.282111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.282137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.607 [2024-10-27 21:32:55.282197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f9008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.282212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.607 [2024-10-27 21:32:55.282267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b1b181b1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.607 [2024-10-27 21:32:55.282281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.608 [2024-10-27 21:32:55.282335] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:b1000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.608 [2024-10-27 21:32:55.282349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.608 #34 NEW cov: 12495 ft: 15074 corp: 14/237b lim: 30 exec/s: 0 rss: 73Mb L: 28/30 MS: 1 ChangeBinInt- 00:07:53.867 [2024-10-27 21:32:55.341551] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (67852) > buf size (4096) 00:07:53.867 [2024-10-27 21:32:55.341772] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004242 00:07:53.867 [2024-10-27 21:32:55.341996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:42420042 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.342022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.867 [2024-10-27 21:32:55.342079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.342093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.867 [2024-10-27 21:32:55.342150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.342167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.867 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:53.867 #35 NEW cov: 12518 ft: 15127 corp: 15/255b lim: 30 exec/s: 0 rss: 73Mb L: 18/30 MS: 1 InsertRepeatedBytes- 00:07:53.867 [2024-10-27 21:32:55.401535] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:53.867 [2024-10-27 21:32:55.401766] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000420a 00:07:53.867 [2024-10-27 21:32:55.401976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.402002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.867 [2024-10-27 21:32:55.402060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.402074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.867 [2024-10-27 21:32:55.402130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000243 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.402144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.867 #36 NEW cov: 12518 ft: 15164 corp: 16/274b lim: 30 exec/s: 0 rss: 73Mb L: 19/30 MS: 1 ChangeBit- 00:07:53.867 [2024-10-27 
21:32:55.441532] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b6b6 00:07:53.867 [2024-10-27 21:32:55.441655] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b6b6 00:07:53.867 [2024-10-27 21:32:55.441860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00b602b6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.441886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.867 [2024-10-27 21:32:55.441949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b6b602b6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.441963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.867 #37 NEW cov: 12518 ft: 15195 corp: 17/289b lim: 30 exec/s: 37 rss: 73Mb L: 15/30 MS: 1 ChangeByte- 00:07:53.867 [2024-10-27 21:32:55.501572] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (67852) > buf size (4096) 00:07:53.867 [2024-10-27 21:32:55.501886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:42420042 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.501912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.867 [2024-10-27 21:32:55.501973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.501987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.867 [2024-10-27 21:32:55.561662] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (67852) > buf size (4096) 00:07:53.867 [2024-10-27 21:32:55.561784] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300001717 00:07:53.867 [2024-10-27 21:32:55.561908] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (810080) > buf size (4096) 00:07:53.867 [2024-10-27 21:32:55.562130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:42420042 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.562159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.867 [2024-10-27 21:32:55.562218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:17178317 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.562232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.867 [2024-10-27 21:32:55.562289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:17178317 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.867 [2024-10-27 21:32:55.562302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.867 #39 NEW cov: 12518 ft: 15222 corp: 18/312b lim: 30 exec/s: 39 rss: 73Mb L: 23/30 
MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:54.127 [2024-10-27 21:32:55.601561] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b6b6 00:07:54.127 [2024-10-27 21:32:55.601784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00b602b6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.127 [2024-10-27 21:32:55.601809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.127 #40 NEW cov: 12518 ft: 15280 corp: 19/323b lim: 30 exec/s: 40 rss: 74Mb L: 11/30 MS: 1 EraseBytes- 00:07:54.127 [2024-10-27 21:32:55.661642] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004242 00:07:54.127 [2024-10-27 21:32:55.661779] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b6b6 00:07:54.127 [2024-10-27 21:32:55.661890] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b6b6 00:07:54.127 [2024-10-27 21:32:55.662113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00b602b6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.127 [2024-10-27 21:32:55.662139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.127 [2024-10-27 21:32:55.662196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:42420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.662210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.662267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b6b602b6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.662280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.128 #41 NEW cov: 12518 ft: 15307 corp: 20/345b lim: 30 exec/s: 41 rss: 74Mb L: 22/30 MS: 1 CrossOver- 00:07:54.128 [2024-10-27 21:32:55.701706] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:54.128 [2024-10-27 21:32:55.701931] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:07:54.128 [2024-10-27 21:32:55.702054] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x5 00:07:54.128 [2024-10-27 21:32:55.702166] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a42 00:07:54.128 [2024-10-27 21:32:55.702384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.702410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.702467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.702481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.702539] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.702552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.702604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.702618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.702669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:42420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.702682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.128 #42 NEW cov: 12518 ft: 15343 corp: 21/375b lim: 30 exec/s: 42 rss: 74Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:54.128 [2024-10-27 21:32:55.761766] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:54.128 [2024-10-27 21:32:55.761888] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b1b1 00:07:54.128 [2024-10-27 21:32:55.762006] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b1b1 00:07:54.128 [2024-10-27 21:32:55.762118] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (148484) > buf size (4096) 00:07:54.128 [2024-10-27 21:32:55.762343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.762369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.762427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f9008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.762440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.762498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b1b181b1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.762512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.762567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:91000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.762581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.128 #43 NEW cov: 12518 ft: 15358 corp: 22/403b lim: 30 exec/s: 43 rss: 74Mb L: 28/30 MS: 1 ChangeBit- 00:07:54.128 [2024-10-27 21:32:55.821785] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (67852) > buf size (4096) 00:07:54.128 [2024-10-27 21:32:55.822030] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300001717 00:07:54.128 
[2024-10-27 21:32:55.822146] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (810080) > buf size (4096) 00:07:54.128 [2024-10-27 21:32:55.822373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:42420042 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.822399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.822459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.822473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.822531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:17178317 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.822545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.128 [2024-10-27 21:32:55.822600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:17178317 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.128 [2024-10-27 21:32:55.822614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.388 #44 NEW cov: 12518 ft: 15406 corp: 23/432b lim: 30 exec/s: 44 rss: 74Mb L: 29/30 MS: 1 InsertRepeatedBytes- 00:07:54.388 [2024-10-27 21:32:55.881816] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (67852) > buf size (4096) 00:07:54.388 [2024-10-27 21:32:55.882145] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x31 00:07:54.388 [2024-10-27 21:32:55.882363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:42420042 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:55.882389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:55.882444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:55.882458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:55.882512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:55.882525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:55.882577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:55.882590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.388 #45 NEW cov: 12518 ft: 15443 corp: 24/458b lim: 30 exec/s: 45 rss: 74Mb L: 26/30 
MS: 1 CrossOver- 00:07:54.388 [2024-10-27 21:32:55.921746] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000b6b6 00:07:54.388 [2024-10-27 21:32:55.921881] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b6b6 00:07:54.388 [2024-10-27 21:32:55.922110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ab683b6 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:55.922136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:55.922194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b6b602b6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:55.922209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.388 #46 NEW cov: 12518 ft: 15492 corp: 25/473b lim: 30 exec/s: 46 rss: 74Mb L: 15/30 MS: 1 ChangeBit- 00:07:54.388 [2024-10-27 21:32:55.961838] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (67852) > buf size (4096) 00:07:54.388 [2024-10-27 21:32:55.962169] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (48) > len (4) 00:07:54.388 [2024-10-27 21:32:55.962396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:42420042 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:55.962420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:55.962479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:55.962493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:55.962546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:55.962559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:55.962614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:55.962628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.388 #47 NEW cov: 12524 ft: 15509 corp: 26/499b lim: 30 exec/s: 47 rss: 74Mb L: 26/30 MS: 1 ChangeASCIIInt- 00:07:54.388 [2024-10-27 21:32:56.021829] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:54.388 [2024-10-27 21:32:56.022099] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004242 00:07:54.388 [2024-10-27 21:32:56.022321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:56.022347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:56.022402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000040 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:56.022417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:56.022471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000208 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:56.022483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.388 #48 NEW cov: 12524 ft: 15576 corp: 27/519b lim: 30 exec/s: 48 rss: 74Mb L: 20/30 MS: 1 ChangeBit- 00:07:54.388 [2024-10-27 21:32:56.081822] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:54.388 [2024-10-27 21:32:56.082076] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004242 00:07:54.388 [2024-10-27 21:32:56.082304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:56.082331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:56.082387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000040 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:56.082401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.388 [2024-10-27 21:32:56.082457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000208 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.388 [2024-10-27 21:32:56.082470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.649 #49 NEW cov: 12524 ft: 15602 corp: 28/539b lim: 30 exec/s: 49 rss: 74Mb L: 20/30 MS: 1 ShuffleBytes- 00:07:54.649 [2024-10-27 21:32:56.141935] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:54.649 [2024-10-27 21:32:56.142174] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:07:54.649 [2024-10-27 21:32:56.142386] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a42 00:07:54.649 [2024-10-27 21:32:56.142622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.142648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.142702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.142715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.649 
[2024-10-27 21:32:56.142767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.142781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.142831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.142844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.142895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:42420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.142909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.649 #50 NEW cov: 12524 ft: 15679 corp: 29/569b lim: 30 exec/s: 50 rss: 74Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:54.649 [2024-10-27 21:32:56.181961] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:54.649 [2024-10-27 21:32:56.182189] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:07:54.649 [2024-10-27 21:32:56.182397] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a42 00:07:54.649 [2024-10-27 21:32:56.182620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.182646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.182703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.182717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.182770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.182784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.182837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.182851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.182907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:42420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.182920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.649 #51 NEW cov: 12524 ft: 15705 corp: 30/599b lim: 30 exec/s: 51 rss: 74Mb L: 
30/30 MS: 1 ChangeBit- 00:07:54.649 [2024-10-27 21:32:56.221953] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:54.649 [2024-10-27 21:32:56.222171] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xa 00:07:54.649 [2024-10-27 21:32:56.222391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.222416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.222475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.222489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.222544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000042 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.222558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.649 #52 NEW cov: 12524 ft: 15707 corp: 31/618b lim: 30 exec/s: 52 rss: 74Mb L: 19/30 MS: 1 CopyPart- 00:07:54.649 [2024-10-27 21:32:56.261869] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b6b6 00:07:54.649 [2024-10-27 21:32:56.262089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00b602b6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.262114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.649 #53 NEW cov: 12524 ft: 15718 corp: 32/629b lim: 30 exec/s: 53 rss: 74Mb L: 11/30 MS: 1 CopyPart- 00:07:54.649 [2024-10-27 21:32:56.321982] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:54.649 [2024-10-27 21:32:56.322101] ctrlr.c:2698:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (512) > len (4) 00:07:54.649 [2024-10-27 21:32:56.322211] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:07:54.649 [2024-10-27 21:32:56.322415] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a42 00:07:54.649 [2024-10-27 21:32:56.322641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.322667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.322725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.322739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.322793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000242 cdw11:00000002 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.322807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.322860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.322874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.649 [2024-10-27 21:32:56.322927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:42420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.322947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.649 #54 NEW cov: 12524 ft: 15727 corp: 33/659b lim: 30 exec/s: 54 rss: 74Mb L: 30/30 MS: 1 ChangeBit- 00:07:54.649 [2024-10-27 21:32:56.361946] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:54.649 [2024-10-27 21:32:56.362187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.649 [2024-10-27 21:32:56.362212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.910 #55 NEW cov: 12524 ft: 15738 corp: 34/669b lim: 30 exec/s: 55 rss: 75Mb L: 10/30 MS: 1 EraseBytes- 00:07:54.910 [2024-10-27 21:32:56.422074] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534796) > buf size (4096) 00:07:54.910 [2024-10-27 21:32:56.422293] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (525192) > buf size (4096) 00:07:54.910 [2024-10-27 21:32:56.422512] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a42 00:07:54.910 [2024-10-27 21:32:56.422739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.910 [2024-10-27 21:32:56.422766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.910 [2024-10-27 21:32:56.422825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.910 [2024-10-27 21:32:56.422840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.910 [2024-10-27 21:32:56.422895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00e10242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.910 [2024-10-27 21:32:56.422908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.910 [2024-10-27 21:32:56.422954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.910 [2024-10-27 21:32:56.422968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.910 [2024-10-27 
21:32:56.423021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:42420242 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.910 [2024-10-27 21:32:56.423034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.910 #56 NEW cov: 12524 ft: 15745 corp: 35/699b lim: 30 exec/s: 28 rss: 75Mb L: 30/30 MS: 1 ChangeByte- 00:07:54.910 #56 DONE cov: 12524 ft: 15745 corp: 35/699b lim: 30 exec/s: 28 rss: 75Mb 00:07:54.910 ###### Recommended dictionary. ###### 00:07:54.910 "\001\000\000\000\000\000\000\020" # Uses: 0 00:07:54.910 ###### End of recommended dictionary. ###### 00:07:54.910 Done 56 runs in 2 second(s) 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:54.910 21:32:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:54.910 [2024-10-27 21:32:56.610293] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 
00:07:54.910 [2024-10-27 21:32:56.610372] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3321705 ] 00:07:55.479 [2024-10-27 21:32:56.924301] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:55.479 [2024-10-27 21:32:56.969848] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.479 [2024-10-27 21:32:56.988168] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.479 [2024-10-27 21:32:57.040393] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.479 [2024-10-27 21:32:57.056698] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:55.479 INFO: Running with entropic power schedule (0xFF, 100). 00:07:55.479 INFO: Seed: 1799842617 00:07:55.479 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:07:55.479 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:07:55.479 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:55.479 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.479 #2 INITED exec/s: 0 rss: 64Mb 00:07:55.479 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:55.479 This may also happen if the target rejected all inputs we tried so far 00:07:55.479 [2024-10-27 21:32:57.122335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:7f7f000a cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-10-27 21:32:57.122362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.479 [2024-10-27 21:32:57.122419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7f7f007f cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-10-27 21:32:57.122433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.479 [2024-10-27 21:32:57.122486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:7f7f007f cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-10-27 21:32:57.122499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.479 [2024-10-27 21:32:57.122556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:7f7f007f cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-10-27 21:32:57.122569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.739 NEW_FUNC[1/715]: 0x462108 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:55.739 NEW_FUNC[2/715]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:55.739 #3 NEW cov: 12207 ft: 12200 corp: 2/32b lim: 35 exec/s: 0 rss: 71Mb L: 31/31 MS: 1 
InsertRepeatedBytes- 00:07:55.739 [2024-10-27 21:32:57.452233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.739 [2024-10-27 21:32:57.452289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.739 [2024-10-27 21:32:57.452375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.739 [2024-10-27 21:32:57.452402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.999 #6 NEW cov: 12320 ft: 13504 corp: 3/47b lim: 35 exec/s: 0 rss: 72Mb L: 15/31 MS: 3 InsertByte-EraseBytes-InsertRepeatedBytes- 00:07:55.999 [2024-10-27 21:32:57.502026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-10-27 21:32:57.502052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.999 [2024-10-27 21:32:57.502105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-10-27 21:32:57.502118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.999 #10 NEW cov: 12326 ft: 13848 corp: 4/65b lim: 35 exec/s: 0 rss: 72Mb L: 18/31 MS: 4 InsertByte-CopyPart-InsertByte-InsertRepeatedBytes- 00:07:55.999 [2024-10-27 21:32:57.541926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-10-27 21:32:57.541957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.999 #11 NEW cov: 12411 ft: 14349 corp: 5/73b lim: 35 exec/s: 0 rss: 72Mb L: 8/31 MS: 1 EraseBytes- 00:07:55.999 [2024-10-27 21:32:57.601906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91320091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-10-27 21:32:57.601931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.999 #12 NEW cov: 12411 ft: 14437 corp: 6/81b lim: 35 exec/s: 0 rss: 72Mb L: 8/31 MS: 1 ChangeByte- 00:07:55.999 [2024-10-27 21:32:57.662092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-10-27 21:32:57.662117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.999 [2024-10-27 21:32:57.662173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-10-27 21:32:57.662186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.999 #13 NEW cov: 12411 ft: 14494 corp: 7/98b lim: 35 exec/s: 0 rss: 72Mb L: 17/31 
MS: 1 CopyPart- 00:07:55.999 [2024-10-27 21:32:57.702133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-10-27 21:32:57.702161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.999 [2024-10-27 21:32:57.702214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00fd cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-10-27 21:32:57.702227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.259 #14 NEW cov: 12411 ft: 14571 corp: 8/116b lim: 35 exec/s: 0 rss: 72Mb L: 18/31 MS: 1 ChangeBinInt- 00:07:56.259 [2024-10-27 21:32:57.762121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.259 [2024-10-27 21:32:57.762146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.259 [2024-10-27 21:32:57.762215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.259 [2024-10-27 21:32:57.762239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.259 #15 NEW cov: 12411 ft: 14622 corp: 9/133b lim: 35 exec/s: 0 rss: 72Mb L: 17/31 MS: 1 ChangeBit- 00:07:56.259 [2024-10-27 21:32:57.822132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.259 [2024-10-27 21:32:57.822157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.259 [2024-10-27 21:32:57.822210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.259 [2024-10-27 21:32:57.822223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.259 #16 NEW cov: 12411 ft: 14666 corp: 10/150b lim: 35 exec/s: 0 rss: 72Mb L: 17/31 MS: 1 ChangeBinInt- 00:07:56.259 [2024-10-27 21:32:57.882060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.259 [2024-10-27 21:32:57.882084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.259 #17 NEW cov: 12411 ft: 14712 corp: 11/158b lim: 35 exec/s: 0 rss: 72Mb L: 8/31 MS: 1 ChangeBinInt- 00:07:56.259 [2024-10-27 21:32:57.922055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.259 [2024-10-27 21:32:57.922080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.259 #18 NEW cov: 12411 ft: 14808 corp: 12/167b lim: 35 exec/s: 0 rss: 72Mb L: 9/31 MS: 1 InsertByte- 00:07:56.259 [2024-10-27 21:32:57.982156] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.259 [2024-10-27 21:32:57.982182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.518 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:56.518 #19 NEW cov: 12434 ft: 14893 corp: 13/176b lim: 35 exec/s: 0 rss: 73Mb L: 9/31 MS: 1 EraseBytes- 00:07:56.518 [2024-10-27 21:32:58.022506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.518 [2024-10-27 21:32:58.022532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.518 [2024-10-27 21:32:58.022586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.518 [2024-10-27 21:32:58.022603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.518 [2024-10-27 21:32:58.022656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.518 [2024-10-27 21:32:58.022669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.518 [2024-10-27 21:32:58.022722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.518 [2024-10-27 21:32:58.022734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.518 #20 NEW cov: 12434 ft: 14955 corp: 14/209b lim: 35 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 CopyPart- 00:07:56.518 [2024-10-27 21:32:58.062134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.518 [2024-10-27 21:32:58.062159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.518 #21 NEW cov: 12434 ft: 15007 corp: 15/217b lim: 35 exec/s: 21 rss: 73Mb L: 8/33 MS: 1 ChangeBinInt- 00:07:56.518 [2024-10-27 21:32:58.122186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:29910091 cdw11:91003291 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.518 [2024-10-27 21:32:58.122211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.518 #22 NEW cov: 12434 ft: 15027 corp: 16/226b lim: 35 exec/s: 22 rss: 73Mb L: 9/33 MS: 1 InsertByte- 00:07:56.518 [2024-10-27 21:32:58.162277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.518 [2024-10-27 21:32:58.162301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.518 [2024-10-27 21:32:58.162368] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:13770013 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.518 [2024-10-27 21:32:58.162382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.518 #23 NEW cov: 12434 ft: 15044 corp: 17/246b lim: 35 exec/s: 23 rss: 73Mb L: 20/33 MS: 1 InsertRepeatedBytes- 00:07:56.518 [2024-10-27 21:32:58.222110] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:56.518 [2024-10-27 21:32:58.222322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:00009140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.518 [2024-10-27 21:32:58.222348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.518 [2024-10-27 21:32:58.222401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:91000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.518 [2024-10-27 21:32:58.222417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.777 #24 NEW cov: 12445 ft: 15112 corp: 18/261b lim: 35 exec/s: 24 rss: 73Mb L: 15/33 MS: 1 CMP- DE: "@\000\000\000\000\000\000\000"- 00:07:56.777 [2024-10-27 21:32:58.262121] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:56.777 [2024-10-27 21:32:58.262354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91000091 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.777 [2024-10-27 21:32:58.262378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.777 [2024-10-27 21:32:58.262438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:000f0000 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.777 [2024-10-27 21:32:58.262454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.777 #25 NEW cov: 12445 ft: 15189 corp: 19/276b lim: 35 exec/s: 25 rss: 73Mb L: 15/33 MS: 1 ChangeBinInt- 00:07:56.777 [2024-10-27 21:32:58.302189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.777 [2024-10-27 21:32:58.302212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.777 #26 NEW cov: 12445 ft: 15198 corp: 20/284b lim: 35 exec/s: 26 rss: 73Mb L: 8/33 MS: 1 ShuffleBytes- 00:07:56.777 [2024-10-27 21:32:58.342208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.777 [2024-10-27 21:32:58.342232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.777 #27 NEW cov: 12445 ft: 15224 corp: 21/297b lim: 35 exec/s: 27 rss: 73Mb L: 13/33 MS: 1 CopyPart- 00:07:56.777 [2024-10-27 21:32:58.382610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 
cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.777 [2024-10-27 21:32:58.382635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.777 [2024-10-27 21:32:58.382689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.777 [2024-10-27 21:32:58.382702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.777 [2024-10-27 21:32:58.382751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.777 [2024-10-27 21:32:58.382765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.777 [2024-10-27 21:32:58.382816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0a910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.777 [2024-10-27 21:32:58.382829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.777 #28 NEW cov: 12445 ft: 15230 corp: 22/326b lim: 35 exec/s: 28 rss: 73Mb L: 29/33 MS: 1 CopyPart- 00:07:56.777 [2024-10-27 21:32:58.422267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91960091 cdw11:9100910a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.777 [2024-10-27 21:32:58.422291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.777 #29 NEW cov: 12445 ft: 15295 corp: 23/334b lim: 35 exec/s: 29 rss: 73Mb L: 8/33 MS: 1 ShuffleBytes- 00:07:56.777 [2024-10-27 21:32:58.462268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:9191000a cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.777 [2024-10-27 21:32:58.462297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.777 #30 NEW cov: 12445 ft: 15309 corp: 24/341b lim: 35 exec/s: 30 rss: 73Mb L: 7/33 MS: 1 InsertRepeatedBytes- 00:07:56.778 [2024-10-27 21:32:58.502315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6e6e006f cdw11:6e006e6e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.778 [2024-10-27 21:32:58.502340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.037 #31 NEW cov: 12445 ft: 15342 corp: 25/349b lim: 35 exec/s: 31 rss: 73Mb L: 8/33 MS: 1 ChangeBinInt- 00:07:57.037 [2024-10-27 21:32:58.542265] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.037 [2024-10-27 21:32:58.542498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91400091 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.542521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.037 [2024-10-27 21:32:58.542575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:91009191 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.542592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.037 #32 NEW cov: 12445 ft: 15358 corp: 26/365b lim: 35 exec/s: 32 rss: 73Mb L: 16/33 MS: 1 PersAutoDict- DE: "@\000\000\000\000\000\000\000"- 00:07:57.037 [2024-10-27 21:32:58.582357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.582381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.037 #33 NEW cov: 12445 ft: 15413 corp: 27/377b lim: 35 exec/s: 33 rss: 73Mb L: 12/33 MS: 1 EraseBytes- 00:07:57.037 [2024-10-27 21:32:58.642640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91000091 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.642664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.037 [2024-10-27 21:32:58.642717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.642731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.037 [2024-10-27 21:32:58.642785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:91910091 cdw11:93009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.642798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.037 #34 NEW cov: 12445 ft: 15611 corp: 28/398b lim: 35 exec/s: 34 rss: 73Mb L: 21/33 MS: 1 InsertRepeatedBytes- 00:07:57.037 [2024-10-27 21:32:58.682784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.682809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.037 [2024-10-27 21:32:58.682859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.682873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.037 [2024-10-27 21:32:58.682925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.682938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.037 [2024-10-27 21:32:58.682994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.683006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.037 #35 NEW cov: 12445 ft: 
15621 corp: 29/431b lim: 35 exec/s: 35 rss: 73Mb L: 33/33 MS: 1 CMP- DE: "\377\377\000e"- 00:07:57.037 [2024-10-27 21:32:58.742417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91960091 cdw11:0000910a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.037 [2024-10-27 21:32:58.742442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.296 #36 NEW cov: 12445 ft: 15651 corp: 30/439b lim: 35 exec/s: 36 rss: 73Mb L: 8/33 MS: 1 CrossOver- 00:07:57.296 [2024-10-27 21:32:58.802704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91ff0091 cdw11:6500ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.296 [2024-10-27 21:32:58.802728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.296 [2024-10-27 21:32:58.802795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.296 [2024-10-27 21:32:58.802809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.296 [2024-10-27 21:32:58.802861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:91910091 cdw11:93009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.296 [2024-10-27 21:32:58.802874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.296 #37 NEW cov: 12445 ft: 15662 corp: 31/460b lim: 35 exec/s: 37 rss: 73Mb L: 21/33 MS: 1 PersAutoDict- DE: "\377\377\000e"- 00:07:57.296 [2024-10-27 21:32:58.842505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.296 [2024-10-27 21:32:58.842528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.296 #38 NEW cov: 12445 ft: 15679 corp: 32/468b lim: 35 exec/s: 38 rss: 73Mb L: 8/33 MS: 1 ShuffleBytes- 00:07:57.296 [2024-10-27 21:32:58.902613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.296 [2024-10-27 21:32:58.902637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.296 [2024-10-27 21:32:58.902686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91910091 cdw11:6f009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.296 [2024-10-27 21:32:58.902699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.296 #39 NEW cov: 12445 ft: 15684 corp: 33/485b lim: 35 exec/s: 39 rss: 73Mb L: 17/33 MS: 1 ChangeBinInt- 00:07:57.296 [2024-10-27 21:32:58.942537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.296 [2024-10-27 21:32:58.942561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.296 #40 NEW cov: 12445 ft: 15740 corp: 34/498b 
lim: 35 exec/s: 40 rss: 73Mb L: 13/33 MS: 1 ChangeByte- 00:07:57.296 [2024-10-27 21:32:58.982469] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.296 [2024-10-27 21:32:58.982696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910034 cdw11:00009140 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.296 [2024-10-27 21:32:58.982721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.296 [2024-10-27 21:32:58.982775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:91000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.296 [2024-10-27 21:32:58.982791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.556 #41 NEW cov: 12445 ft: 15744 corp: 35/513b lim: 35 exec/s: 41 rss: 73Mb L: 15/33 MS: 1 ChangeByte- 00:07:57.556 [2024-10-27 21:32:59.042561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.556 [2024-10-27 21:32:59.042586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.556 #42 NEW cov: 12445 ft: 15746 corp: 36/522b lim: 35 exec/s: 42 rss: 73Mb L: 9/33 MS: 1 ChangeBit- 00:07:57.556 [2024-10-27 21:32:59.102666] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.556 [2024-10-27 21:32:59.102887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:91910091 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.556 [2024-10-27 21:32:59.102912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.556 [2024-10-27 21:32:59.102969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:91400091 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.556 [2024-10-27 21:32:59.102983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.556 [2024-10-27 21:32:59.103033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:91009191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.556 [2024-10-27 21:32:59.103050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.556 #43 NEW cov: 12445 ft: 15762 corp: 37/547b lim: 35 exec/s: 21 rss: 73Mb L: 25/33 MS: 1 PersAutoDict- DE: "@\000\000\000\000\000\000\000"- 00:07:57.556 #43 DONE cov: 12445 ft: 15762 corp: 37/547b lim: 35 exec/s: 21 rss: 73Mb 00:07:57.556 ###### Recommended dictionary. ###### 00:07:57.556 "@\000\000\000\000\000\000\000" # Uses: 2 00:07:57.556 "\377\377\000e" # Uses: 1 00:07:57.556 ###### End of recommended dictionary. 
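The paired hex values in the completion prints above -- INVALID FIELD (00/02), INVALID NAMESPACE OR FORMAT (00/0b) -- are the Status Code Type and Status Code that spdk_nvme_print_completion() pulls out of Dword 3 of the NVMe completion queue entry, alongside the phase (p), more (m) and do-not-retry (dnr) bits it prints next to them. A minimal standalone decoder, assuming the NVMe 1.x CQE bit layout (an illustrative sketch, not SPDK's implementation; the sqhd value in the log comes from Dword 2, which this sketch leaves out):

#include <stdint.h>
#include <stdio.h>

/* Unpack the status-bearing Dword 3 of an NVMe completion queue entry. */
static void decode_cqe_dw3(uint32_t dw3)
{
    unsigned cid = dw3 & 0xffff;       /* Command Identifier           */
    unsigned p   = (dw3 >> 16) & 0x1;  /* Phase tag                    */
    unsigned sc  = (dw3 >> 17) & 0xff; /* Status Code                  */
    unsigned sct = (dw3 >> 25) & 0x7;  /* Status Code Type             */
    unsigned m   = (dw3 >> 30) & 0x1;  /* More info in error log page  */
    unsigned dnr = (dw3 >> 31) & 0x1;  /* Do Not Retry                 */

    /* SCT 0x0 / SC 0x02 is the generic-status "INVALID FIELD" seen in
     * the log; SCT 0x0 / SC 0x0b is "INVALID NAMESPACE OR FORMAT".    */
    printf("cid:%u (%02x/%02x) p:%u m:%u dnr:%u\n", cid, sct, sc, p, m, dnr);
}

int main(void)
{
    decode_cqe_dw3(0x02u << 17); /* -> "cid:0 (00/02) p:0 m:0 dnr:0" */
    return 0;
}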
###### 00:07:57.556 Done 43 runs in 2 second(s) 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:57.556 21:32:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:57.556 [2024-10-27 21:32:59.267483] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:07:57.556 [2024-10-27 21:32:59.267569] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3322067 ] 00:07:58.124 [2024-10-27 21:32:59.578753] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
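The trace above shows the harness (common.sh's start_llvm_fuzz) launching instance 3 of llvm_nvme_fuzz: it rewrites the JSON config so this instance listens on its own TCP port (sed replacing trsvcid 4420 with 4403), points it at a private corpus directory (llvm_nvmf_3), and selects the fuzzer with -Z 3. The NEW_FUNC lines that follow name the entry points libFuzzer instruments -- TestOneInput in llvm_nvme_fuzz.c dispatching to a per-opcode handler such as fuzz_admin_abort_command. The general shape of such a target is an ordinary libFuzzer callback that reinterprets the input bytes as an NVMe admin command and submits it to the listener; a compilable stand-in (clang -g -fsanitize=fuzzer sketch.c), using a local 64-byte struct in place of SPDK's spdk_nvme_cmd -- the struct and field names here are illustrative, not SPDK's actual harness:

#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct nvme_cmd_sketch {          /* 64-byte admin command skeleton    */
    uint8_t  opc;                 /* opcode, e.g. 0x06 = IDENTIFY      */
    uint8_t  rsvd[3];
    uint32_t nsid;                /* namespace id ("nsid:0" in the log)*/
    uint8_t  dwords[56];          /* cdw2..cdw15, SGL fields, opaque   */
};

int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    struct nvme_cmd_sketch cmd;

    if (size < sizeof(cmd))       /* too short to map onto a command   */
        return 0;

    memcpy(&cmd, data, sizeof(cmd));
    /* A real harness would now submit cmd over the NVMe/TCP connection
     * (here 127.0.0.1:4403) and poll for the completion whose status
     * fields the *NOTICE* lines in this log print. */
    (void)cmd;
    return 0;
}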
00:07:58.124 [2024-10-27 21:32:59.624521] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.124 [2024-10-27 21:32:59.645024] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.124 [2024-10-27 21:32:59.697379] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.124 [2024-10-27 21:32:59.713683] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:58.124 INFO: Running with entropic power schedule (0xFF, 100). 00:07:58.124 INFO: Seed: 161880626 00:07:58.124 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:07:58.124 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:07:58.124 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:58.124 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.124 #2 INITED exec/s: 0 rss: 64Mb 00:07:58.124 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:58.124 This may also happen if the target rejected all inputs we tried so far 00:07:58.383 NEW_FUNC[1/704]: 0x463de8 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:58.383 NEW_FUNC[2/704]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.383 #12 NEW cov: 12127 ft: 12124 corp: 2/19b lim: 20 exec/s: 0 rss: 71Mb L: 18/18 MS: 5 CopyPart-InsertByte-ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:07:58.642 #19 NEW cov: 12240 ft: 12795 corp: 3/36b lim: 20 exec/s: 0 rss: 72Mb L: 17/18 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:58.642 #20 NEW cov: 12246 ft: 13073 corp: 4/53b lim: 20 exec/s: 0 rss: 72Mb L: 17/18 MS: 1 ChangeByte- 00:07:58.642 #21 NEW cov: 12331 ft: 13300 corp: 5/72b lim: 20 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 CopyPart- 00:07:58.902 #22 NEW cov: 12331 ft: 13376 corp: 6/90b lim: 20 exec/s: 0 rss: 72Mb L: 18/19 MS: 1 CMP- DE: "\001\006"- 00:07:58.902 [2024-10-27 21:33:00.428428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.902 [2024-10-27 21:33:00.428479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.902 NEW_FUNC[1/17]: 0x13824c8 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3477 00:07:58.902 NEW_FUNC[2/17]: 0x1383048 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3419 00:07:58.902 #27 NEW cov: 12582 ft: 14054 corp: 7/100b lim: 20 exec/s: 0 rss: 72Mb L: 10/19 MS: 5 PersAutoDict-EraseBytes-ChangeByte-ChangeByte-CMP- DE: "\001\006"-"\001\000\000\000\000\000\000\000"- 00:07:58.902 #28 NEW cov: 12582 ft: 14165 corp: 8/119b lim: 20 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 ShuffleBytes- 00:07:59.160 #29 NEW cov: 12582 ft: 14237 corp: 9/137b lim: 20 exec/s: 0 rss: 72Mb L: 18/19 MS: 1 PersAutoDict- DE: "\001\006"- 00:07:59.160 [2024-10-27 21:33:00.648541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.160 [2024-10-27 21:33:00.648583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.160 NEW_FUNC[1/1]: 0x1c3fc18 
in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:59.160 #30 NEW cov: 12609 ft: 14400 corp: 10/152b lim: 20 exec/s: 0 rss: 72Mb L: 15/19 MS: 1 InsertRepeatedBytes- 00:07:59.160 #33 NEW cov: 12609 ft: 14760 corp: 11/156b lim: 20 exec/s: 33 rss: 72Mb L: 4/19 MS: 3 PersAutoDict-ChangeByte-InsertByte- DE: "\001\006"- 00:07:59.160 #34 NEW cov: 12609 ft: 14806 corp: 12/175b lim: 20 exec/s: 34 rss: 72Mb L: 19/19 MS: 1 ShuffleBytes- 00:07:59.419 [2024-10-27 21:33:00.888562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.419 [2024-10-27 21:33:00.888599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.419 #35 NEW cov: 12609 ft: 14849 corp: 13/185b lim: 20 exec/s: 35 rss: 72Mb L: 10/19 MS: 1 ChangeByte- 00:07:59.419 #36 NEW cov: 12609 ft: 14935 corp: 14/191b lim: 20 exec/s: 36 rss: 73Mb L: 6/19 MS: 1 CMP- DE: "\377\003"- 00:07:59.419 [2024-10-27 21:33:01.038614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.419 [2024-10-27 21:33:01.038649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.419 #37 NEW cov: 12609 ft: 14950 corp: 15/204b lim: 20 exec/s: 37 rss: 73Mb L: 13/19 MS: 1 EraseBytes- 00:07:59.419 [2024-10-27 21:33:01.128650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.419 [2024-10-27 21:33:01.128681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.678 NEW_FUNC[1/3]: 0x14f8a58 in nvmf_transport_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:784 00:07:59.678 NEW_FUNC[2/3]: 0x151ff68 in nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3702 00:07:59.678 #38 NEW cov: 12689 ft: 15162 corp: 16/223b lim: 20 exec/s: 38 rss: 73Mb L: 19/19 MS: 1 CopyPart- 00:07:59.678 #39 NEW cov: 12689 ft: 15168 corp: 17/241b lim: 20 exec/s: 39 rss: 73Mb L: 18/19 MS: 1 InsertByte- 00:07:59.678 #40 NEW cov: 12689 ft: 15200 corp: 18/252b lim: 20 exec/s: 40 rss: 73Mb L: 11/19 MS: 1 EraseBytes- 00:07:59.678 #41 NEW cov: 12689 ft: 15229 corp: 19/263b lim: 20 exec/s: 41 rss: 73Mb L: 11/19 MS: 1 ChangeByte- 00:07:59.937 #42 NEW cov: 12689 ft: 15250 corp: 20/277b lim: 20 exec/s: 42 rss: 73Mb L: 14/19 MS: 1 CrossOver- 00:07:59.937 #43 NEW cov: 12689 ft: 15282 corp: 21/296b lim: 20 exec/s: 43 rss: 73Mb L: 19/19 MS: 1 ChangeBinInt- 00:07:59.937 #44 NEW cov: 12689 ft: 15308 corp: 22/313b lim: 20 exec/s: 44 rss: 73Mb L: 17/19 MS: 1 ShuffleBytes- 00:07:59.937 #45 NEW cov: 12689 ft: 15337 corp: 23/331b lim: 20 exec/s: 45 rss: 73Mb L: 18/19 MS: 1 ShuffleBytes- 00:07:59.937 #46 NEW cov: 12689 ft: 15346 corp: 24/349b lim: 20 exec/s: 46 rss: 73Mb L: 18/19 MS: 1 ChangeBit- 00:07:59.937 [2024-10-27 21:33:01.648685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.937 [2024-10-27 21:33:01.648717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.197 #47 NEW cov: 12689 ft: 15380 
corp: 25/360b lim: 20 exec/s: 47 rss: 73Mb L: 11/19 MS: 1 InsertByte- 00:08:00.197 #48 NEW cov: 12689 ft: 15389 corp: 26/379b lim: 20 exec/s: 24 rss: 73Mb L: 19/19 MS: 1 InsertByte- 00:08:00.197 #48 DONE cov: 12689 ft: 15389 corp: 26/379b lim: 20 exec/s: 24 rss: 73Mb 00:08:00.197 ###### Recommended dictionary. ###### 00:08:00.197 "\001\006" # Uses: 3 00:08:00.197 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:00.197 "\377\003" # Uses: 0 00:08:00.197 ###### End of recommended dictionary. ###### 00:08:00.197 Done 48 runs in 2 second(s) 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:00.197 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.456 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:00.456 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:00.456 21:33:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:08:00.456 [2024-10-27 21:33:01.954224] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 
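The "###### Recommended dictionary ######" block above is libFuzzer reporting byte sequences whose insertion produced new coverage during the run, with a Uses count per entry; the MS/DE fields on the NEW lines show the same entries being re-applied via the PersAutoDict mutation. Such a report can be fed back into a later run as a seed dictionary. libFuzzer's -dict= option takes AFL-style dictionary files, whose quoted values use \xNN hex escapes, so the octal C escapes printed here would be rewritten when saving them. A hand-converted sketch -- the file name, token names, and whether this particular harness forwards standard libFuzzer options like -dict= are assumptions:

# nvmf_3.dict -- converted from the report above ("\001\006" -> "\x01\x06")
admin_cmd_1="\x01\x06"
admin_cmd_2="\x01\x00\x00\x00\x00\x00\x00\x00"
admin_cmd_3="\xff\x03"

It would then be appended to the llvm_nvme_fuzz command line shown in the trace, e.g. adding -dict=nvmf_3.dict after the -Z argument.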
00:08:00.456 [2024-10-27 21:33:01.954299] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3322653 ] 00:08:00.715 [2024-10-27 21:33:02.275924] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:00.715 [2024-10-27 21:33:02.322047] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.715 [2024-10-27 21:33:02.340397] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.715 [2024-10-27 21:33:02.392950] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.715 [2024-10-27 21:33:02.409254] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:00.715 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.715 INFO: Seed: 2855890169 00:08:00.974 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:00.974 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:00.974 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:00.974 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.974 #2 INITED exec/s: 0 rss: 64Mb 00:08:00.974 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:00.974 This may also happen if the target rejected all inputs we tried so far 00:08:00.974 [2024-10-27 21:33:02.479114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:60812eff cdw11:06ef0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.974 [2024-10-27 21:33:02.479151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.233 NEW_FUNC[1/716]: 0x464ee8 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:01.233 NEW_FUNC[2/716]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.233 #30 NEW cov: 12228 ft: 12201 corp: 2/10b lim: 35 exec/s: 0 rss: 72Mb L: 9/9 MS: 3 ShuffleBytes-ChangeByte-CMP- DE: ".\377`\201\006\357z\000"- 00:08:01.233 [2024-10-27 21:33:02.818670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.233 [2024-10-27 21:33:02.818707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.233 [2024-10-27 21:33:02.818829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.233 [2024-10-27 21:33:02.818847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.233 #35 NEW cov: 12341 ft: 13680 corp: 3/27b lim: 35 exec/s: 0 rss: 72Mb L: 17/17 MS: 5 ChangeBit-ChangeBit-ChangeByte-ChangeBinInt-InsertRepeatedBytes- 00:08:01.233 [2024-10-27 21:33:02.878602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 
nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.233 [2024-10-27 21:33:02.878631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.233 [2024-10-27 21:33:02.878754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.233 [2024-10-27 21:33:02.878771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.233 #41 NEW cov: 12347 ft: 13819 corp: 4/44b lim: 35 exec/s: 0 rss: 72Mb L: 17/17 MS: 1 ShuffleBytes- 00:08:01.233 [2024-10-27 21:33:02.938412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:60812eff cdw11:06ef0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.233 [2024-10-27 21:33:02.938440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.492 #42 NEW cov: 12432 ft: 14095 corp: 5/53b lim: 35 exec/s: 0 rss: 72Mb L: 9/17 MS: 1 ChangeBit- 00:08:01.492 [2024-10-27 21:33:02.998486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff602e2e cdw11:81060003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.492 [2024-10-27 21:33:02.998513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.492 #43 NEW cov: 12432 ft: 14245 corp: 6/62b lim: 35 exec/s: 0 rss: 72Mb L: 9/17 MS: 1 PersAutoDict- DE: ".\377`\201\006\357z\000"- 00:08:01.492 [2024-10-27 21:33:03.048725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.492 [2024-10-27 21:33:03.048753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.492 [2024-10-27 21:33:03.048864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.492 [2024-10-27 21:33:03.048881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.492 #44 NEW cov: 12432 ft: 14344 corp: 7/79b lim: 35 exec/s: 0 rss: 72Mb L: 17/17 MS: 1 ShuffleBytes- 00:08:01.492 [2024-10-27 21:33:03.119061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff2effff cdw11:ff600001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.492 [2024-10-27 21:33:03.119089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.492 [2024-10-27 21:33:03.119214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:7a0006ef cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.492 [2024-10-27 21:33:03.119234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.492 [2024-10-27 21:33:03.119346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.492 [2024-10-27 21:33:03.119363] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.492 #45 NEW cov: 12432 ft: 14608 corp: 8/104b lim: 35 exec/s: 0 rss: 73Mb L: 25/25 MS: 1 PersAutoDict- DE: ".\377`\201\006\357z\000"- 00:08:01.492 [2024-10-27 21:33:03.188764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.492 [2024-10-27 21:33:03.188793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.492 [2024-10-27 21:33:03.188920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.492 [2024-10-27 21:33:03.188938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.492 #46 NEW cov: 12432 ft: 14652 corp: 9/121b lim: 35 exec/s: 0 rss: 73Mb L: 17/25 MS: 1 ShuffleBytes- 00:08:01.751 [2024-10-27 21:33:03.238797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.751 [2024-10-27 21:33:03.238827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.751 [2024-10-27 21:33:03.238954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0100ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.751 [2024-10-27 21:33:03.238972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.751 #47 NEW cov: 12432 ft: 14720 corp: 10/138b lim: 35 exec/s: 0 rss: 73Mb L: 17/25 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:01.751 [2024-10-27 21:33:03.309457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff602e2e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.751 [2024-10-27 21:33:03.309487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.751 [2024-10-27 21:33:03.309605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.751 [2024-10-27 21:33:03.309622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.751 [2024-10-27 21:33:03.309740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.751 [2024-10-27 21:33:03.309757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.751 [2024-10-27 21:33:03.309869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:81060000 cdw11:ef7a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.751 [2024-10-27 21:33:03.309887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.751 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:01.751 #48 NEW cov: 12455 ft: 15102 corp: 11/166b lim: 35 exec/s: 0 rss: 73Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:01.752 [2024-10-27 21:33:03.379270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:60812eff cdw11:06ef0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.752 [2024-10-27 21:33:03.379299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.752 [2024-10-27 21:33:03.379418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:41410041 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.752 [2024-10-27 21:33:03.379435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.752 [2024-10-27 21:33:03.379540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:41414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.752 [2024-10-27 21:33:03.379557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.752 #49 NEW cov: 12455 ft: 15141 corp: 12/191b lim: 35 exec/s: 0 rss: 73Mb L: 25/28 MS: 1 InsertRepeatedBytes- 00:08:01.752 [2024-10-27 21:33:03.428990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.752 [2024-10-27 21:33:03.429017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.752 [2024-10-27 21:33:03.429128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:d9ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.752 [2024-10-27 21:33:03.429145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.752 #50 NEW cov: 12455 ft: 15160 corp: 13/208b lim: 35 exec/s: 50 rss: 73Mb L: 17/28 MS: 1 ChangeByte- 00:08:02.010 [2024-10-27 21:33:03.479073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.479100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.010 [2024-10-27 21:33:03.479212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:01007fff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.479228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.010 #51 NEW cov: 12455 ft: 15211 corp: 14/225b lim: 35 exec/s: 51 rss: 73Mb L: 17/28 MS: 1 ChangeBit- 00:08:02.010 [2024-10-27 21:33:03.549382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.549410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.010 [2024-10-27 21:33:03.549546] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.549564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.010 [2024-10-27 21:33:03.549681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:4848ffff cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.549700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.010 #52 NEW cov: 12455 ft: 15247 corp: 15/248b lim: 35 exec/s: 52 rss: 73Mb L: 23/28 MS: 1 InsertRepeatedBytes- 00:08:02.010 [2024-10-27 21:33:03.599182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.599210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.010 [2024-10-27 21:33:03.599329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.599348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.010 #53 NEW cov: 12455 ft: 15258 corp: 16/266b lim: 35 exec/s: 53 rss: 73Mb L: 18/28 MS: 1 CopyPart- 00:08:02.010 [2024-10-27 21:33:03.649577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff2effff cdw11:ff600001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.649608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.010 [2024-10-27 21:33:03.649732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:7a0006ef cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.649751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.010 [2024-10-27 21:33:03.649898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:30ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.649916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.010 #54 NEW cov: 12455 ft: 15280 corp: 17/292b lim: 35 exec/s: 54 rss: 73Mb L: 26/28 MS: 1 InsertByte- 00:08:02.010 [2024-10-27 21:33:03.718966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.010 [2024-10-27 21:33:03.718996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.268 #55 NEW cov: 12455 ft: 15319 corp: 18/305b lim: 35 exec/s: 55 rss: 73Mb L: 13/28 MS: 1 EraseBytes- 00:08:02.269 [2024-10-27 21:33:03.769286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:02.269 [2024-10-27 21:33:03.769315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.269 [2024-10-27 21:33:03.769425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.269 [2024-10-27 21:33:03.769443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.269 #56 NEW cov: 12455 ft: 15363 corp: 19/322b lim: 35 exec/s: 56 rss: 73Mb L: 17/28 MS: 1 ShuffleBytes- 00:08:02.269 [2024-10-27 21:33:03.819634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.269 [2024-10-27 21:33:03.819665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.269 [2024-10-27 21:33:03.819786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.269 [2024-10-27 21:33:03.819803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.269 [2024-10-27 21:33:03.819919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:4848ffff cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.269 [2024-10-27 21:33:03.819937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.269 #57 NEW cov: 12455 ft: 15381 corp: 20/345b lim: 35 exec/s: 57 rss: 73Mb L: 23/28 MS: 1 ChangeByte- 00:08:02.269 [2024-10-27 21:33:03.889392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.269 [2024-10-27 21:33:03.889420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.269 [2024-10-27 21:33:03.889542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fffff6ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.269 [2024-10-27 21:33:03.889560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.269 #58 NEW cov: 12455 ft: 15387 corp: 21/362b lim: 35 exec/s: 58 rss: 73Mb L: 17/28 MS: 1 ChangeBinInt- 00:08:02.269 [2024-10-27 21:33:03.959345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.269 [2024-10-27 21:33:03.959376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.269 [2024-10-27 21:33:03.959489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.269 [2024-10-27 21:33:03.959508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.269 #59 NEW cov: 12455 ft: 15413 corp: 22/379b lim: 35 exec/s: 59 rss: 
73Mb L: 17/28 MS: 1 ShuffleBytes- 00:08:02.527 [2024-10-27 21:33:04.009095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff602e2e cdw11:81060003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.527 [2024-10-27 21:33:04.009123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.528 #60 NEW cov: 12455 ft: 15447 corp: 23/388b lim: 35 exec/s: 60 rss: 73Mb L: 9/28 MS: 1 CopyPart- 00:08:02.528 [2024-10-27 21:33:04.059955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:6e6effff cdw11:6e6e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.059979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.528 [2024-10-27 21:33:04.060000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6e6e6e6e cdw11:6e6e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.060011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.528 [2024-10-27 21:33:04.060029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:6eff6e6e cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.060040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.528 [2024-10-27 21:33:04.060159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.060176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.528 #61 NEW cov: 12455 ft: 15452 corp: 24/420b lim: 35 exec/s: 61 rss: 73Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:02.528 [2024-10-27 21:33:04.109964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff2effff cdw11:ff600001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.109993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.528 [2024-10-27 21:33:04.110105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff06ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.110121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.528 [2024-10-27 21:33:04.110233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00ffef7a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.110253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.528 [2024-10-27 21:33:04.110365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.110383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.528 #62 NEW cov: 12455 ft: 15494 corp: 25/451b lim: 35 exec/s: 62 rss: 73Mb L: 31/32 MS: 1 CopyPart- 00:08:02.528 [2024-10-27 21:33:04.159746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.159776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.528 [2024-10-27 21:33:04.159889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.159906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.528 [2024-10-27 21:33:04.160029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:4848ffff cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.160045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.528 #63 NEW cov: 12455 ft: 15495 corp: 26/474b lim: 35 exec/s: 63 rss: 73Mb L: 23/32 MS: 1 CopyPart- 00:08:02.528 [2024-10-27 21:33:04.219564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.219591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.528 [2024-10-27 21:33:04.219705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.528 [2024-10-27 21:33:04.219734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.528 #64 NEW cov: 12455 ft: 15500 corp: 27/491b lim: 35 exec/s: 64 rss: 74Mb L: 17/32 MS: 1 ChangeBinInt- 00:08:02.787 [2024-10-27 21:33:04.259553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.787 [2024-10-27 21:33:04.259582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.787 [2024-10-27 21:33:04.259710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.787 [2024-10-27 21:33:04.259728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.787 #65 NEW cov: 12455 ft: 15508 corp: 28/508b lim: 35 exec/s: 65 rss: 74Mb L: 17/32 MS: 1 ChangeBit- 00:08:02.787 [2024-10-27 21:33:04.309545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.787 [2024-10-27 21:33:04.309574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.787 [2024-10-27 21:33:04.309692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.787 [2024-10-27 21:33:04.309709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.787 #66 NEW cov: 12455 ft: 15514 corp: 29/526b lim: 35 exec/s: 66 rss: 74Mb L: 18/32 MS: 1 InsertByte- 00:08:02.787 [2024-10-27 21:33:04.349260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1e98820d cdw11:07ef0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.787 [2024-10-27 21:33:04.349289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.787 #67 NEW cov: 12455 ft: 15612 corp: 30/535b lim: 35 exec/s: 67 rss: 74Mb L: 9/32 MS: 1 CMP- DE: "\202\015\036\230\007\357z\000"- 00:08:02.787 [2024-10-27 21:33:04.399381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7aef6000 cdw11:082e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.787 [2024-10-27 21:33:04.399410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.787 #72 NEW cov: 12455 ft: 15618 corp: 31/548b lim: 35 exec/s: 72 rss: 74Mb L: 13/32 MS: 5 EraseBytes-ChangeBinInt-EraseBytes-ChangeBit-CMP- DE: "\000z\357\010.I\345\332"- 00:08:02.787 [2024-10-27 21:33:04.449691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffdfff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.787 [2024-10-27 21:33:04.449721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.787 [2024-10-27 21:33:04.449837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:fffff6ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.787 [2024-10-27 21:33:04.449855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.787 #73 NEW cov: 12455 ft: 15637 corp: 32/565b lim: 35 exec/s: 36 rss: 74Mb L: 17/32 MS: 1 ChangeBit- 00:08:02.787 #73 DONE cov: 12455 ft: 15637 corp: 32/565b lim: 35 exec/s: 36 rss: 74Mb 00:08:02.787 ###### Recommended dictionary. ###### 00:08:02.787 ".\377`\201\006\357z\000" # Uses: 2 00:08:02.787 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:02.787 "\202\015\036\230\007\357z\000" # Uses: 0 00:08:02.787 "\000z\357\010.I\345\332" # Uses: 0 00:08:02.787 ###### End of recommended dictionary. 
###### 00:08:02.787 Done 73 runs in 2 second(s) 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:03.046 21:33:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:08:03.046 [2024-10-27 21:33:04.636749] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:03.046 [2024-10-27 21:33:04.636816] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3323426 ] 00:08:03.304 [2024-10-27 21:33:04.953112] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
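The nvmf/run.sh trace above shows how each fuzzer instance gets its own TCP listener: the port is derived from the fuzzer index (44 plus the zero-padded index), a per-fuzzer corpus directory is created, and the stock fuzz_json.conf is rewritten so its trsvcid points at that port before llvm_nvme_fuzz is launched. A minimal bash sketch of those steps, reconstructed from the trace (SPDK_DIR and the suppression-file redirection are assumptions; the trace only shows the echo and sed commands themselves):

    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk  # assumed rootdir
    i=5                                  # fuzzer index from the ../common.sh loop
    port=44$(printf %02d "$i")           # -> 4405, matching the trace above
    mkdir -p "$SPDK_DIR/../corpus/llvm_nvmf_$i"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # rewrite the default listener port in the JSON config for this instance
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$i.conf"
    # assumed destination of the two leak: lines echoed in the trace (LSAN suppressions)
    printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > /var/tmp/suppress_nvmf_fuzz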
00:08:03.304 [2024-10-27 21:33:05.000383] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.304 [2024-10-27 21:33:05.018552] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.563 [2024-10-27 21:33:05.071000] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:03.563 [2024-10-27 21:33:05.087292] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:03.563 INFO: Running with entropic power schedule (0xFF, 100). 00:08:03.563 INFO: Seed: 1238927724 00:08:03.563 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:03.563 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:03.563 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:03.563 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.563 #2 INITED exec/s: 0 rss: 65Mb 00:08:03.563 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:03.563 This may also happen if the target rejected all inputs we tried so far 00:08:03.563 [2024-10-27 21:33:05.159233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.563 [2024-10-27 21:33:05.159274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.563 [2024-10-27 21:33:05.159344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.563 [2024-10-27 21:33:05.159360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.563 [2024-10-27 21:33:05.159425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.563 [2024-10-27 21:33:05.159442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.563 [2024-10-27 21:33:05.159510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.563 [2024-10-27 21:33:05.159527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.822 NEW_FUNC[1/716]: 0x467088 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:03.822 NEW_FUNC[2/716]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.822 #20 NEW cov: 12239 ft: 12238 corp: 2/43b lim: 45 exec/s: 0 rss: 72Mb L: 42/42 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:03.822 [2024-10-27 21:33:05.508662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-10-27 21:33:05.508706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.822 [2024-10-27 21:33:05.508838] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-10-27 21:33:05.508858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.822 [2024-10-27 21:33:05.508992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-10-27 21:33:05.509010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.822 [2024-10-27 21:33:05.509143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-10-27 21:33:05.509161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.081 #21 NEW cov: 12352 ft: 12832 corp: 3/86b lim: 45 exec/s: 0 rss: 72Mb L: 43/43 MS: 1 InsertByte- 00:08:04.081 [2024-10-27 21:33:05.578622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.578655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.578778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.578795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.578917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.578935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.579063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.579080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.081 #22 NEW cov: 12358 ft: 13104 corp: 4/128b lim: 45 exec/s: 0 rss: 72Mb L: 42/43 MS: 1 ChangeBit- 00:08:04.081 [2024-10-27 21:33:05.628619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a0a cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.628650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.628772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.628789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.628915] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.628933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.629059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.629076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.081 #23 NEW cov: 12443 ft: 13381 corp: 5/164b lim: 45 exec/s: 0 rss: 72Mb L: 36/43 MS: 1 CrossOver- 00:08:04.081 [2024-10-27 21:33:05.678551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a0a cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.678581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.678715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.678735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.678862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.678879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.679007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9e9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.679024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.081 #24 NEW cov: 12443 ft: 13460 corp: 6/207b lim: 45 exec/s: 0 rss: 72Mb L: 43/43 MS: 1 CrossOver- 00:08:04.081 [2024-10-27 21:33:05.748715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.748745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.748872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.748891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.749007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.749026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.081 [2024-10-27 21:33:05.749139] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:63636b63 cdw11:63630003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.081 [2024-10-27 21:33:05.749157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.081 #25 NEW cov: 12443 ft: 13669 corp: 7/250b lim: 45 exec/s: 0 rss: 73Mb L: 43/43 MS: 1 ChangeBinInt- 00:08:04.340 [2024-10-27 21:33:05.818728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a0a cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.340 [2024-10-27 21:33:05.818757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.340 [2024-10-27 21:33:05.818882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.340 [2024-10-27 21:33:05.818911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.340 [2024-10-27 21:33:05.819060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.340 [2024-10-27 21:33:05.819078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.340 [2024-10-27 21:33:05.819206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.340 [2024-10-27 21:33:05.819228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.340 #26 NEW cov: 12443 ft: 13720 corp: 8/286b lim: 45 exec/s: 0 rss: 73Mb L: 36/43 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:04.340 [2024-10-27 21:33:05.868700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.340 [2024-10-27 21:33:05.868728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.340 [2024-10-27 21:33:05.868843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.340 [2024-10-27 21:33:05.868861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.340 [2024-10-27 21:33:05.868989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.340 [2024-10-27 21:33:05.869007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.340 [2024-10-27 21:33:05.869129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.340 [2024-10-27 21:33:05.869148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.340 #27 NEW cov: 12443 ft: 13761 
corp: 9/329b lim: 45 exec/s: 0 rss: 73Mb L: 43/43 MS: 1 CopyPart- 00:08:04.341 [2024-10-27 21:33:05.918189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-10-27 21:33:05.918218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.341 [2024-10-27 21:33:05.918345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-10-27 21:33:05.918362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.341 #28 NEW cov: 12443 ft: 14187 corp: 10/351b lim: 45 exec/s: 0 rss: 73Mb L: 22/43 MS: 1 EraseBytes- 00:08:04.341 [2024-10-27 21:33:05.988805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a0a cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-10-27 21:33:05.988834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.341 [2024-10-27 21:33:05.988966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-10-27 21:33:05.988983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.341 [2024-10-27 21:33:05.989098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-10-27 21:33:05.989115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.341 [2024-10-27 21:33:05.989231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9e9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-10-27 21:33:05.989248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.341 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:04.341 #29 NEW cov: 12466 ft: 14212 corp: 11/394b lim: 45 exec/s: 0 rss: 73Mb L: 43/43 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:04.341 [2024-10-27 21:33:06.058881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-10-27 21:33:06.058910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.341 [2024-10-27 21:33:06.059048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-10-27 21:33:06.059066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.341 [2024-10-27 21:33:06.059188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 
cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-10-27 21:33:06.059205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.341 [2024-10-27 21:33:06.059326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff9c9c cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-10-27 21:33:06.059342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.600 #35 NEW cov: 12466 ft: 14243 corp: 12/436b lim: 45 exec/s: 0 rss: 73Mb L: 42/43 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\005"- 00:08:04.600 [2024-10-27 21:33:06.108333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.108362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.600 [2024-10-27 21:33:06.108493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.108513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.600 #36 NEW cov: 12466 ft: 14305 corp: 13/458b lim: 45 exec/s: 36 rss: 73Mb L: 22/43 MS: 1 EraseBytes- 00:08:04.600 [2024-10-27 21:33:06.158909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c020a9c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.158938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.600 [2024-10-27 21:33:06.159062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.159079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.600 [2024-10-27 21:33:06.159201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.159217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.600 [2024-10-27 21:33:06.159340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.159356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.600 #37 NEW cov: 12466 ft: 14371 corp: 14/501b lim: 45 exec/s: 37 rss: 73Mb L: 43/43 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:04.600 [2024-10-27 21:33:06.208636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:48480a48 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.208668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.600 [2024-10-27 21:33:06.208782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.208800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.600 [2024-10-27 21:33:06.208918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.208936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.600 #38 NEW cov: 12466 ft: 14601 corp: 15/530b lim: 45 exec/s: 38 rss: 73Mb L: 29/43 MS: 1 InsertRepeatedBytes- 00:08:04.600 [2024-10-27 21:33:06.259053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.259080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.600 [2024-10-27 21:33:06.259203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.259221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.600 [2024-10-27 21:33:06.259340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c24 cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.259359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.600 [2024-10-27 21:33:06.259477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.600 [2024-10-27 21:33:06.259493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.600 #39 NEW cov: 12466 ft: 14607 corp: 16/573b lim: 45 exec/s: 39 rss: 73Mb L: 43/43 MS: 1 InsertByte- 00:08:04.859 [2024-10-27 21:33:06.329050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c020a9c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.329079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.329198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.329216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.329348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.329366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.329492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00009c02 cdw11:009c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.329508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.859 #40 NEW cov: 12466 ft: 14641 corp: 17/616b lim: 45 exec/s: 40 rss: 73Mb L: 43/43 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:04.859 [2024-10-27 21:33:06.399035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.399069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.399187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.399213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.399332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c24 cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.399350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.399478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.399494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.859 #41 NEW cov: 12466 ft: 14645 corp: 18/659b lim: 45 exec/s: 41 rss: 73Mb L: 43/43 MS: 1 CopyPart- 00:08:04.859 [2024-10-27 21:33:06.469054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.469081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.469200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.469218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.469337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.469354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.469482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.469498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.859 #42 NEW cov: 12466 ft: 14663 corp: 19/702b lim: 45 exec/s: 42 rss: 73Mb L: 43/43 MS: 1 InsertByte- 00:08:04.859 [2024-10-27 21:33:06.518764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:48480a48 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.518793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.518917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.518933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.859 [2024-10-27 21:33:06.519062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.859 [2024-10-27 21:33:06.519080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.860 #43 NEW cov: 12466 ft: 14712 corp: 20/731b lim: 45 exec/s: 43 rss: 73Mb L: 29/43 MS: 1 ChangeByte- 00:08:05.118 [2024-10-27 21:33:06.589216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.118 [2024-10-27 21:33:06.589252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.118 [2024-10-27 21:33:06.589373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.118 [2024-10-27 21:33:06.589391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.118 [2024-10-27 21:33:06.589504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.118 [2024-10-27 21:33:06.589525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.118 [2024-10-27 21:33:06.589644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.118 [2024-10-27 21:33:06.589662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.118 #44 NEW cov: 12466 ft: 14759 corp: 21/767b lim: 45 exec/s: 44 rss: 74Mb L: 36/43 MS: 1 EraseBytes- 00:08:05.118 [2024-10-27 21:33:06.659187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c2a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.118 [2024-10-27 21:33:06.659217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.118 [2024-10-27 21:33:06.659342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:05.118 [2024-10-27 21:33:06.659361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.119 [2024-10-27 21:33:06.659485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.119 [2024-10-27 21:33:06.659504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.119 [2024-10-27 21:33:06.659626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.119 [2024-10-27 21:33:06.659643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.119 #45 NEW cov: 12466 ft: 14812 corp: 22/809b lim: 45 exec/s: 45 rss: 74Mb L: 42/43 MS: 1 ChangeBit- 00:08:05.119 [2024-10-27 21:33:06.709297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.119 [2024-10-27 21:33:06.709324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.119 [2024-10-27 21:33:06.709442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:4848000a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.119 [2024-10-27 21:33:06.709459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.119 [2024-10-27 21:33:06.709579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.119 [2024-10-27 21:33:06.709597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.119 [2024-10-27 21:33:06.709711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:48483a48 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.119 [2024-10-27 21:33:06.709727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.119 #46 NEW cov: 12466 ft: 14866 corp: 23/848b lim: 45 exec/s: 46 rss: 74Mb L: 39/43 MS: 1 InsertRepeatedBytes- 00:08:05.119 [2024-10-27 21:33:06.779303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:409c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.119 [2024-10-27 21:33:06.779332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.119 [2024-10-27 21:33:06.779454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.119 [2024-10-27 21:33:06.779473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.119 [2024-10-27 21:33:06.779593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:05.119 [2024-10-27 21:33:06.779612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.119 [2024-10-27 21:33:06.779732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.119 [2024-10-27 21:33:06.779751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.119 #47 NEW cov: 12466 ft: 14876 corp: 24/885b lim: 45 exec/s: 47 rss: 74Mb L: 37/43 MS: 1 InsertByte- 00:08:05.378 [2024-10-27 21:33:06.849358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.849390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:06.849513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.849530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:06.849637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.849655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:06.849778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff9c9c cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.849798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.378 #48 NEW cov: 12466 ft: 14897 corp: 25/927b lim: 45 exec/s: 48 rss: 74Mb L: 42/43 MS: 1 ChangeBit- 00:08:05.378 [2024-10-27 21:33:06.919423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c020a9c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.919455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:06.919583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.919603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:06.919724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.919743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:06.919871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.919890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.378 #49 NEW cov: 12466 ft: 14917 corp: 26/970b lim: 45 exec/s: 49 rss: 74Mb L: 43/43 MS: 1 ShuffleBytes- 00:08:05.378 [2024-10-27 21:33:06.969434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.969464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:06.969600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.969618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:06.969741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.969763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:06.969890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:06.969907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.378 #50 NEW cov: 12466 ft: 14965 corp: 27/1013b lim: 45 exec/s: 50 rss: 74Mb L: 43/43 MS: 1 ChangeBit- 00:08:05.378 [2024-10-27 21:33:07.019515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c0a0a cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:07.019546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:07.019668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:07.019687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.378 [2024-10-27 21:33:07.019811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:02009c9c cdw11:00000004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.378 [2024-10-27 21:33:07.019830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.379 [2024-10-27 21:33:07.019958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.379 [2024-10-27 21:33:07.019977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.379 #51 NEW cov: 12466 ft: 14996 corp: 28/1053b lim: 45 exec/s: 51 rss: 74Mb L: 40/43 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:05.379 [2024-10-27 21:33:07.089514] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9c9c2a9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.379 [2024-10-27 21:33:07.089543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.379 [2024-10-27 21:33:07.089665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.379 [2024-10-27 21:33:07.089689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.379 [2024-10-27 21:33:07.089804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.379 [2024-10-27 21:33:07.089822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.379 [2024-10-27 21:33:07.089946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:9c9c9c9c cdw11:9c9c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.379 [2024-10-27 21:33:07.089965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.638 #52 NEW cov: 12466 ft: 15009 corp: 29/1095b lim: 45 exec/s: 26 rss: 74Mb L: 42/43 MS: 1 CopyPart- 00:08:05.638 #52 DONE cov: 12466 ft: 15009 corp: 29/1095b lim: 45 exec/s: 26 rss: 74Mb 00:08:05.638 ###### Recommended dictionary. ###### 00:08:05.638 "\002\000\000\000" # Uses: 5 00:08:05.638 "\377\377\377\377\377\377\377\005" # Uses: 0 00:08:05.638 ###### End of recommended dictionary. 
###### 00:08:05.638 Done 52 runs in 2 second(s) 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:05.638 21:33:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:08:05.638 [2024-10-27 21:33:07.275629] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:05.638 [2024-10-27 21:33:07.275693] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3324154 ] 00:08:05.897 [2024-10-27 21:33:07.587854] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
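The "Recommended dictionary" summary printed at the end of each run (see the block above, e.g. "\002\000\000\000" # Uses: 5) lists octal-escaped byte strings that libFuzzer found profitable as CMP/PersAutoDict material. printf's octal escapes match that encoding, so the raw bytes can be inspected with od; two hypothetical one-liners, not part of run.sh:

    printf '\002\000\000\000' | od -An -tx1
    #  02 00 00 00                # the 4-byte entry reported above with "Uses: 5"
    printf '\377\377\377\377\377\377\377\005' | od -An -tx1
    #  ff ff ff ff ff ff ff 05    # the 8-byte entry recorded from a CMP mutation above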
00:08:06.156 [2024-10-27 21:33:07.634318] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.156 [2024-10-27 21:33:07.655644] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.156 [2024-10-27 21:33:07.707972] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:06.156 [2024-10-27 21:33:07.724289] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:06.156 INFO: Running with entropic power schedule (0xFF, 100). 00:08:06.156 INFO: Seed: 3876912247 00:08:06.156 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:06.156 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:06.156 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:06.156 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.156 #2 INITED exec/s: 0 rss: 64Mb 00:08:06.156 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:06.156 This may also happen if the target rejected all inputs we tried so far 00:08:06.156 [2024-10-27 21:33:07.769910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.156 [2024-10-27 21:33:07.769938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.156 [2024-10-27 21:33:07.770013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.156 [2024-10-27 21:33:07.770027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.156 [2024-10-27 21:33:07.770080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.156 [2024-10-27 21:33:07.770093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.156 [2024-10-27 21:33:07.770149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.156 [2024-10-27 21:33:07.770162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.414 NEW_FUNC[1/714]: 0x469898 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:06.414 NEW_FUNC[2/714]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.414 #5 NEW cov: 12156 ft: 12153 corp: 2/10b lim: 10 exec/s: 0 rss: 71Mb L: 9/9 MS: 3 ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:06.414 [2024-10-27 21:33:08.099859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.414 [2024-10-27 21:33:08.099890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.414 [2024-10-27 21:33:08.099958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 00:08:06.414 [2024-10-27 21:33:08.099972] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.415 [2024-10-27 21:33:08.100023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.415 [2024-10-27 21:33:08.100036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.415 [2024-10-27 21:33:08.100087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000000 00:08:06.415 [2024-10-27 21:33:08.100100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.673 #6 NEW cov: 12269 ft: 12752 corp: 3/19b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:06.673 [2024-10-27 21:33:08.159812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.673 [2024-10-27 21:33:08.159841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.673 [2024-10-27 21:33:08.159890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.673 [2024-10-27 21:33:08.159903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.673 [2024-10-27 21:33:08.159954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005100 cdw11:00000000 00:08:06.673 [2024-10-27 21:33:08.159968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.673 [2024-10-27 21:33:08.160014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.673 [2024-10-27 21:33:08.160026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.673 #7 NEW cov: 12275 ft: 13031 corp: 4/28b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeByte- 00:08:06.673 [2024-10-27 21:33:08.199483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:06.673 [2024-10-27 21:33:08.199510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.673 #8 NEW cov: 12360 ft: 13604 corp: 5/30b lim: 10 exec/s: 0 rss: 72Mb L: 2/9 MS: 1 CopyPart- 00:08:06.673 [2024-10-27 21:33:08.239856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.673 [2024-10-27 21:33:08.239882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.673 [2024-10-27 21:33:08.239934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.239952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.674 [2024-10-27 21:33:08.240004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 
nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.240017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.674 [2024-10-27 21:33:08.240067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.240079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.674 #9 NEW cov: 12360 ft: 13749 corp: 6/39b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeBit- 00:08:06.674 [2024-10-27 21:33:08.279869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.279895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.674 [2024-10-27 21:33:08.279949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.279962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.674 [2024-10-27 21:33:08.280015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.280028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.674 [2024-10-27 21:33:08.280078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000400 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.280091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.674 #10 NEW cov: 12360 ft: 13779 corp: 7/48b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CMP- DE: "\001\000\000\000\000\000\004\000"- 00:08:06.674 [2024-10-27 21:33:08.319526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.319551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.674 #11 NEW cov: 12360 ft: 13924 corp: 8/50b lim: 10 exec/s: 0 rss: 72Mb L: 2/9 MS: 1 CopyPart- 00:08:06.674 [2024-10-27 21:33:08.359914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.359939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.674 [2024-10-27 21:33:08.359999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.360013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.674 [2024-10-27 21:33:08.360063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.360093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
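Each NOTICE pair in this trace is the harness printing the fuzzed admin command (nvme_admin_qpair_print_command) followed by the target's completion (spdk_nvme_print_completion). For DELETE IO CQ (opcode 04h) the NVMe base spec carries the queue identifier in CDW10 bits 15:0, and the (00/01) in every completion is status code type 0, status code 01h, i.e. Invalid Command Opcode, meaning the target refuses each mutated command. Decoding one cdw10 value copied from the lines above, purely as an illustration:

# QID lives in CDW10[15:0] for Delete I/O SQ/CQ commands
printf 'qid=%d\n' $(( 0x00005100 & 0xFFFF ))   # -> qid=20736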
00:08:06.674 [2024-10-27 21:33:08.360145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000000 00:08:06.674 [2024-10-27 21:33:08.360157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.932 #12 NEW cov: 12360 ft: 13963 corp: 9/59b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:06.932 [2024-10-27 21:33:08.419578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a69 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.419603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.932 #14 NEW cov: 12360 ft: 13996 corp: 10/61b lim: 10 exec/s: 0 rss: 72Mb L: 2/9 MS: 2 ShuffleBytes-InsertByte- 00:08:06.932 [2024-10-27 21:33:08.459949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.459975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.460025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.460038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.460087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005100 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.460100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.460147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000900 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.460159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.932 #15 NEW cov: 12360 ft: 14085 corp: 11/70b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:06.932 [2024-10-27 21:33:08.519965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.519990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.520041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.520061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.520112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.520125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.520175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000029 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.520187] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.932 #16 NEW cov: 12360 ft: 14108 corp: 12/78b lim: 10 exec/s: 0 rss: 72Mb L: 8/9 MS: 1 EraseBytes- 00:08:06.932 [2024-10-27 21:33:08.580008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.580033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.580085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.580098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.580148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.580162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.580212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000429 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.580225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.932 #17 NEW cov: 12360 ft: 14143 corp: 13/86b lim: 10 exec/s: 0 rss: 72Mb L: 8/9 MS: 1 ShuffleBytes- 00:08:06.932 [2024-10-27 21:33:08.640051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.640077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.640130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.640143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.640190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000500 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.640203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.932 [2024-10-27 21:33:08.640269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000429 cdw11:00000000 00:08:06.932 [2024-10-27 21:33:08.640282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.190 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:07.190 #18 NEW cov: 12383 ft: 14176 corp: 14/94b lim: 10 exec/s: 0 rss: 72Mb L: 8/9 MS: 1 ChangeBinInt- 00:08:07.190 [2024-10-27 21:33:08.700073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:07.190 [2024-10-27 21:33:08.700097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:07.190 [2024-10-27 21:33:08.700148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.190 [2024-10-27 21:33:08.700164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.190 [2024-10-27 21:33:08.700214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.700227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.700276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.700288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.191 #19 NEW cov: 12383 ft: 14177 corp: 15/103b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\001"- 00:08:07.191 [2024-10-27 21:33:08.740066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.740091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.740143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.740156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.740205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.740218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.740269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000029 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.740281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.191 #20 NEW cov: 12383 ft: 14212 corp: 16/111b lim: 10 exec/s: 20 rss: 72Mb L: 8/9 MS: 1 ShuffleBytes- 00:08:07.191 [2024-10-27 21:33:08.780075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.780100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.780150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000009c cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.780163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.780213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.780226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.780275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.780288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.191 #21 NEW cov: 12383 ft: 14237 corp: 17/120b lim: 10 exec/s: 21 rss: 72Mb L: 9/9 MS: 1 InsertByte- 00:08:07.191 [2024-10-27 21:33:08.820093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009000 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.820118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.820171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.820187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.820237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.820250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.820300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000429 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.820312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.191 #22 NEW cov: 12383 ft: 14254 corp: 18/128b lim: 10 exec/s: 22 rss: 72Mb L: 8/9 MS: 1 ChangeByte- 00:08:07.191 [2024-10-27 21:33:08.860034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.860058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.860110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.860123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.191 [2024-10-27 21:33:08.860174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.860187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.191 #23 NEW cov: 12383 ft: 14423 corp: 19/135b lim: 10 exec/s: 23 rss: 72Mb L: 7/9 MS: 1 EraseBytes- 00:08:07.191 [2024-10-27 21:33:08.899802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000abe cdw11:00000000 00:08:07.191 [2024-10-27 21:33:08.899826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.449 #24 NEW cov: 12383 ft: 14465 corp: 20/137b lim: 10 exec/s: 24 rss: 73Mb L: 2/9 MS: 1 ChangeByte- 00:08:07.449 [2024-10-27 21:33:08.960168] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.449 [2024-10-27 21:33:08.960193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.449 [2024-10-27 21:33:08.960243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.449 [2024-10-27 21:33:08.960256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.449 [2024-10-27 21:33:08.960305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.449 [2024-10-27 21:33:08.960318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.449 [2024-10-27 21:33:08.960366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.449 [2024-10-27 21:33:08.960378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.449 #25 NEW cov: 12383 ft: 14493 corp: 21/146b lim: 10 exec/s: 25 rss: 73Mb L: 9/9 MS: 1 CrossOver- 00:08:07.450 [2024-10-27 21:33:09.000084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.000110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.000160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.000177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.000228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000600 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.000241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.450 #26 NEW cov: 12383 ft: 14514 corp: 22/153b lim: 10 exec/s: 26 rss: 73Mb L: 7/9 MS: 1 ChangeBit- 00:08:07.450 [2024-10-27 21:33:09.060252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009000 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.060277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.060328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.060342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.060391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.060404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.060455] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000429 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.060468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.450 #27 NEW cov: 12383 ft: 14533 corp: 23/161b lim: 10 exec/s: 27 rss: 73Mb L: 8/9 MS: 1 ChangeBit- 00:08:07.450 [2024-10-27 21:33:09.120262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000002f cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.120287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.120336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000101 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.120349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.120399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.120412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.120461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.120473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.450 #28 NEW cov: 12383 ft: 14540 corp: 24/170b lim: 10 exec/s: 28 rss: 73Mb L: 9/9 MS: 1 CMP- DE: "/\001"- 00:08:07.450 [2024-10-27 21:33:09.160405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.160429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.160479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.160492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.160540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.160553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.160622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.160635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.450 [2024-10-27 21:33:09.160684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000229 cdw11:00000000 00:08:07.450 [2024-10-27 21:33:09.160696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.708 #29 NEW cov: 12383 ft: 14588 corp: 25/180b 
lim: 10 exec/s: 29 rss: 73Mb L: 10/10 MS: 1 CopyPart- 00:08:07.708 [2024-10-27 21:33:09.220320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.708 [2024-10-27 21:33:09.220344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.708 [2024-10-27 21:33:09.220394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000051 cdw11:00000000 00:08:07.708 [2024-10-27 21:33:09.220408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.708 [2024-10-27 21:33:09.220459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.708 [2024-10-27 21:33:09.220472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.708 [2024-10-27 21:33:09.220522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000029 cdw11:00000000 00:08:07.708 [2024-10-27 21:33:09.220534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.708 #30 NEW cov: 12383 ft: 14613 corp: 26/188b lim: 10 exec/s: 30 rss: 73Mb L: 8/10 MS: 1 EraseBytes- 00:08:07.708 [2024-10-27 21:33:09.260321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009000 cdw11:00000000 00:08:07.708 [2024-10-27 21:33:09.260345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.708 [2024-10-27 21:33:09.260393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.708 [2024-10-27 21:33:09.260407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.708 [2024-10-27 21:33:09.260457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.708 [2024-10-27 21:33:09.260469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.709 [2024-10-27 21:33:09.260516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000046c cdw11:00000000 00:08:07.709 [2024-10-27 21:33:09.260529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.709 #31 NEW cov: 12383 ft: 14647 corp: 27/196b lim: 10 exec/s: 31 rss: 73Mb L: 8/10 MS: 1 ChangeByte- 00:08:07.709 [2024-10-27 21:33:09.300095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.709 [2024-10-27 21:33:09.300120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.709 [2024-10-27 21:33:09.300172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.709 [2024-10-27 21:33:09.300185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.709 #32 NEW cov: 12383 ft: 14857 corp: 28/201b lim: 10 exec/s: 32 rss: 73Mb L: 5/10 MS: 1 EraseBytes- 00:08:07.709 [2024-10-27 21:33:09.340336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.709 [2024-10-27 21:33:09.340360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.709 [2024-10-27 21:33:09.340413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 00:08:07.709 [2024-10-27 21:33:09.340426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.709 [2024-10-27 21:33:09.340475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002500 cdw11:00000000 00:08:07.709 [2024-10-27 21:33:09.340488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.709 [2024-10-27 21:33:09.340538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000000 00:08:07.709 [2024-10-27 21:33:09.340551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.709 #33 NEW cov: 12383 ft: 14877 corp: 29/210b lim: 10 exec/s: 33 rss: 73Mb L: 9/10 MS: 1 ChangeByte- 00:08:07.709 [2024-10-27 21:33:09.400248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.709 [2024-10-27 21:33:09.400273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.709 [2024-10-27 21:33:09.400327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000000 00:08:07.709 [2024-10-27 21:33:09.400339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.709 [2024-10-27 21:33:09.400389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:00000000 00:08:07.709 [2024-10-27 21:33:09.400402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.709 #34 NEW cov: 12383 ft: 14892 corp: 30/217b lim: 10 exec/s: 34 rss: 73Mb L: 7/10 MS: 1 ShuffleBytes- 00:08:07.967 [2024-10-27 21:33:09.440223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 00:08:07.967 [2024-10-27 21:33:09.440248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.967 [2024-10-27 21:33:09.440299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:08:07.967 [2024-10-27 21:33:09.440313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.967 [2024-10-27 21:33:09.440366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 
00:08:07.968 [2024-10-27 21:33:09.440395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.968 #35 NEW cov: 12383 ft: 14908 corp: 31/224b lim: 10 exec/s: 35 rss: 73Mb L: 7/10 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:07.968 [2024-10-27 21:33:09.480529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a00 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.480554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.480604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.480617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.480672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.480686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.480737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.480750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.480799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000229 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.480811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.968 #36 NEW cov: 12383 ft: 14921 corp: 32/234b lim: 10 exec/s: 36 rss: 73Mb L: 10/10 MS: 1 ChangeByte- 00:08:07.968 [2024-10-27 21:33:09.540464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.540489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.540542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.540555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.540605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.540618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.540669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.540681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.968 #37 NEW cov: 12383 ft: 14935 corp: 33/242b lim: 10 exec/s: 37 rss: 73Mb L: 8/10 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\001"- 
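The #N status lines are standard libFuzzer output: cov counts covered code edges, ft counts coverage features, corp is the corpus size as inputs/bytes, lim is the current input-length cap, exec/s the execution rate, rss resident memory, L the new input's length over the largest input currently in the corpus, and MS the mutation sequence that produced it. A hypothetical helper for pulling each run's final numbers out of a saved console log (the build.log name is made up for the example):

# Final coverage/feature/corpus figures, one line per completed fuzz run
grep -o '#[0-9]* DONE cov: [0-9]* ft: [0-9]* corp: [0-9/b]*' build.log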
00:08:07.968 [2024-10-27 21:33:09.600454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.600479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.600529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000051 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.600542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.600592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000800 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.600605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.600656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000029 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.600668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.968 #38 NEW cov: 12383 ft: 14946 corp: 34/250b lim: 10 exec/s: 38 rss: 73Mb L: 8/10 MS: 1 ChangeBit- 00:08:07.968 [2024-10-27 21:33:09.660468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009000 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.660494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.660544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.660560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.660612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.660625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.968 [2024-10-27 21:33:09.660677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00001429 cdw11:00000000 00:08:07.968 [2024-10-27 21:33:09.660689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.226 #39 NEW cov: 12383 ft: 14950 corp: 35/258b lim: 10 exec/s: 39 rss: 74Mb L: 8/10 MS: 1 ChangeBit- 00:08:08.226 [2024-10-27 21:33:09.720151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:08.226 [2024-10-27 21:33:09.720177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.226 #40 NEW cov: 12383 ft: 14972 corp: 36/260b lim: 10 exec/s: 20 rss: 74Mb L: 2/10 MS: 1 ChangeByte- 00:08:08.226 #40 DONE cov: 12383 ft: 14972 corp: 36/260b lim: 10 exec/s: 20 rss: 74Mb 00:08:08.226 ###### Recommended dictionary. 
###### 00:08:08.226 "\001\000\000\000\000\000\004\000" # Uses: 0 00:08:08.226 "\001\000\000\000\000\000\000\001" # Uses: 1 00:08:08.226 "/\001" # Uses: 0 00:08:08.226 "\377\377\377\377" # Uses: 0 00:08:08.226 ###### End of recommended dictionary. ###### 00:08:08.226 Done 40 runs in 2 second(s) 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:08.226 21:33:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:08:08.226 [2024-10-27 21:33:09.907863] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:08.226 [2024-10-27 21:33:09.907965] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3324532 ] 00:08:08.793 [2024-10-27 21:33:10.219019] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
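Before each launch, run.sh also writes a LeakSanitizer suppression file (the two echo lines above), presumably to keep known, accepted leaks in the qpair-disconnect and controller-create paths from failing the run. A sketch of the mechanism, with the file path and symbols taken from the trace:

# One "leak:<pattern>" per allocation stack LSan should ignore
echo "leak:spdk_nvmf_qpair_disconnect" > /var/tmp/suppress_nvmf_fuzz
echo "leak:nvmf_ctrlr_create" >> /var/tmp/suppress_nvmf_fuzz
# report_objects=1 lists leaked objects; print_suppressions=0 hides the suppression summary
export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0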
00:08:08.793 [2024-10-27 21:33:10.266489] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.793 [2024-10-27 21:33:10.285074] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.793 [2024-10-27 21:33:10.337881] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.793 [2024-10-27 21:33:10.354191] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:08.793 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.793 INFO: Seed: 2211974936 00:08:08.793 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:08.793 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:08.793 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:08.793 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.793 #2 INITED exec/s: 0 rss: 64Mb 00:08:08.793 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:08.793 This may also happen if the target rejected all inputs we tried so far 00:08:08.793 [2024-10-27 21:33:10.409448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007a0a cdw11:00000000 00:08:08.793 [2024-10-27 21:33:10.409480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.052 NEW_FUNC[1/714]: 0x46a298 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:09.052 NEW_FUNC[2/714]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:09.052 #3 NEW cov: 12156 ft: 12143 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 InsertByte- 00:08:09.052 [2024-10-27 21:33:10.750445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:09.052 [2024-10-27 21:33:10.750493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.052 [2024-10-27 21:33:10.750611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:09.052 [2024-10-27 21:33:10.750632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.052 #9 NEW cov: 12269 ft: 13113 corp: 3/8b lim: 10 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:09.310 [2024-10-27 21:33:10.800174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:09.310 [2024-10-27 21:33:10.800205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.310 #10 NEW cov: 12275 ft: 13349 corp: 4/10b lim: 10 exec/s: 0 rss: 72Mb L: 2/5 MS: 1 InsertByte- 00:08:09.310 [2024-10-27 21:33:10.840202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a04 cdw11:00000000 00:08:09.310 [2024-10-27 21:33:10.840230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.310 #11 NEW cov: 12360 ft: 13641 corp: 5/12b lim: 10 
exec/s: 0 rss: 72Mb L: 2/5 MS: 1 ChangeBit- 00:08:09.310 [2024-10-27 21:33:10.910261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007a2a cdw11:00000000 00:08:09.310 [2024-10-27 21:33:10.910289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.310 #12 NEW cov: 12360 ft: 13819 corp: 6/14b lim: 10 exec/s: 0 rss: 72Mb L: 2/5 MS: 1 ChangeBit- 00:08:09.310 [2024-10-27 21:33:10.980240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a26 cdw11:00000000 00:08:09.310 [2024-10-27 21:33:10.980268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.310 #13 NEW cov: 12360 ft: 13883 corp: 7/16b lim: 10 exec/s: 0 rss: 72Mb L: 2/5 MS: 1 ChangeByte- 00:08:09.569 [2024-10-27 21:33:11.050677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:09.569 [2024-10-27 21:33:11.050706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.569 [2024-10-27 21:33:11.050817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:09.569 [2024-10-27 21:33:11.050834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.569 [2024-10-27 21:33:11.050951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a3f cdw11:00000000 00:08:09.569 [2024-10-27 21:33:11.050969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.569 #14 NEW cov: 12360 ft: 14133 corp: 8/22b lim: 10 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 InsertByte- 00:08:09.569 [2024-10-27 21:33:11.120309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000109 cdw11:00000000 00:08:09.569 [2024-10-27 21:33:11.120335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.569 #17 NEW cov: 12360 ft: 14157 corp: 9/24b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 3 EraseBytes-ChangeBinInt-InsertByte- 00:08:09.569 [2024-10-27 21:33:11.190893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:09.569 [2024-10-27 21:33:11.190922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.569 [2024-10-27 21:33:11.191029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:09.569 [2024-10-27 21:33:11.191048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.569 [2024-10-27 21:33:11.191149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a3f cdw11:00000000 00:08:09.569 [2024-10-27 21:33:11.191167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.569 #18 NEW cov: 12360 ft: 14192 corp: 10/31b lim: 10 
exec/s: 0 rss: 72Mb L: 7/7 MS: 1 InsertByte- 00:08:09.569 [2024-10-27 21:33:11.260536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a04 cdw11:00000000 00:08:09.569 [2024-10-27 21:33:11.260564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.569 #19 NEW cov: 12360 ft: 14307 corp: 11/33b lim: 10 exec/s: 0 rss: 73Mb L: 2/7 MS: 1 CopyPart- 00:08:09.828 [2024-10-27 21:33:11.311177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005e5e cdw11:00000000 00:08:09.828 [2024-10-27 21:33:11.311206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.828 [2024-10-27 21:33:11.311314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005e5e cdw11:00000000 00:08:09.828 [2024-10-27 21:33:11.311332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.828 [2024-10-27 21:33:11.311441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005e5e cdw11:00000000 00:08:09.828 [2024-10-27 21:33:11.311464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.828 [2024-10-27 21:33:11.311579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00005e5e cdw11:00000000 00:08:09.828 [2024-10-27 21:33:11.311595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.828 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:09.828 #20 NEW cov: 12383 ft: 14573 corp: 12/42b lim: 10 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:08:09.828 [2024-10-27 21:33:11.360563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:09.828 [2024-10-27 21:33:11.360591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.828 #21 NEW cov: 12383 ft: 14594 corp: 13/45b lim: 10 exec/s: 0 rss: 73Mb L: 3/9 MS: 1 EraseBytes- 00:08:09.828 [2024-10-27 21:33:11.410565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000950a cdw11:00000000 00:08:09.828 [2024-10-27 21:33:11.410593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.829 #22 NEW cov: 12383 ft: 14638 corp: 14/48b lim: 10 exec/s: 22 rss: 73Mb L: 3/9 MS: 1 InsertByte- 00:08:09.829 [2024-10-27 21:33:11.480611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:09.829 [2024-10-27 21:33:11.480641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.829 #24 NEW cov: 12383 ft: 14696 corp: 15/50b lim: 10 exec/s: 24 rss: 73Mb L: 2/9 MS: 2 CopyPart-InsertByte- 00:08:09.829 [2024-10-27 21:33:11.530600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000950a 
cdw11:00000000 00:08:09.829 [2024-10-27 21:33:11.530628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.088 #25 NEW cov: 12383 ft: 14760 corp: 16/53b lim: 10 exec/s: 25 rss: 73Mb L: 3/9 MS: 1 ChangeByte- 00:08:10.088 [2024-10-27 21:33:11.601161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e4e5 cdw11:00000000 00:08:10.088 [2024-10-27 21:33:11.601190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.088 [2024-10-27 21:33:11.601305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:10.088 [2024-10-27 21:33:11.601322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.088 [2024-10-27 21:33:11.601435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a3f cdw11:00000000 00:08:10.088 [2024-10-27 21:33:11.601452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.088 #26 NEW cov: 12383 ft: 14766 corp: 17/60b lim: 10 exec/s: 26 rss: 73Mb L: 7/9 MS: 1 ChangeBit- 00:08:10.088 [2024-10-27 21:33:11.670790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000109 cdw11:00000000 00:08:10.088 [2024-10-27 21:33:11.670819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.088 #27 NEW cov: 12383 ft: 14792 corp: 18/62b lim: 10 exec/s: 27 rss: 73Mb L: 2/9 MS: 1 ShuffleBytes- 00:08:10.088 [2024-10-27 21:33:11.740821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:10.088 [2024-10-27 21:33:11.740852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.088 #28 NEW cov: 12383 ft: 14809 corp: 19/65b lim: 10 exec/s: 28 rss: 73Mb L: 3/9 MS: 1 ChangeBit- 00:08:10.088 [2024-10-27 21:33:11.810940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a19 cdw11:00000000 00:08:10.088 [2024-10-27 21:33:11.810973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.378 #29 NEW cov: 12383 ft: 14810 corp: 20/67b lim: 10 exec/s: 29 rss: 73Mb L: 2/9 MS: 1 InsertByte- 00:08:10.378 [2024-10-27 21:33:11.861801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:10.378 [2024-10-27 21:33:11.861829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.378 [2024-10-27 21:33:11.861936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.378 [2024-10-27 21:33:11.861967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.378 [2024-10-27 21:33:11.862078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff 
cdw11:00000000 00:08:10.378 [2024-10-27 21:33:11.862095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.378 [2024-10-27 21:33:11.862204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.378 [2024-10-27 21:33:11.862221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.378 [2024-10-27 21:33:11.862336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.378 [2024-10-27 21:33:11.862356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.378 #30 NEW cov: 12383 ft: 14873 corp: 21/77b lim: 10 exec/s: 30 rss: 73Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:10.378 [2024-10-27 21:33:11.910948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000404 cdw11:00000000 00:08:10.378 [2024-10-27 21:33:11.910975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.378 #31 NEW cov: 12383 ft: 14906 corp: 22/79b lim: 10 exec/s: 31 rss: 73Mb L: 2/10 MS: 1 CopyPart- 00:08:10.378 [2024-10-27 21:33:11.960991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ea7a cdw11:00000000 00:08:10.378 [2024-10-27 21:33:11.961018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.378 #32 NEW cov: 12383 ft: 14914 corp: 23/82b lim: 10 exec/s: 32 rss: 73Mb L: 3/10 MS: 1 InsertByte- 00:08:10.378 [2024-10-27 21:33:12.011850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:10.378 [2024-10-27 21:33:12.011877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.378 [2024-10-27 21:33:12.011997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009393 cdw11:00000000 00:08:10.378 [2024-10-27 21:33:12.012014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.378 [2024-10-27 21:33:12.012127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00009393 cdw11:00000000 00:08:10.378 [2024-10-27 21:33:12.012145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.378 [2024-10-27 21:33:12.012247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00009393 cdw11:00000000 00:08:10.378 [2024-10-27 21:33:12.012266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.378 [2024-10-27 21:33:12.012383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00009393 cdw11:00000000 00:08:10.378 [2024-10-27 21:33:12.012399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.378 #33 NEW cov: 12383 ft: 
14931 corp: 24/92b lim: 10 exec/s: 33 rss: 73Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:10.378 [2024-10-27 21:33:12.061359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007a98 cdw11:00000000 00:08:10.378 [2024-10-27 21:33:12.061387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.378 [2024-10-27 21:33:12.061501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009898 cdw11:00000000 00:08:10.378 [2024-10-27 21:33:12.061518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.378 #34 NEW cov: 12383 ft: 14962 corp: 25/97b lim: 10 exec/s: 34 rss: 73Mb L: 5/10 MS: 1 InsertRepeatedBytes- 00:08:10.636 [2024-10-27 21:33:12.111191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:10.636 [2024-10-27 21:33:12.111218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.637 #35 NEW cov: 12383 ft: 14993 corp: 26/99b lim: 10 exec/s: 35 rss: 73Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:10.637 [2024-10-27 21:33:12.181248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a04 cdw11:00000000 00:08:10.637 [2024-10-27 21:33:12.181276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.637 [2024-10-27 21:33:12.251209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a23 cdw11:00000000 00:08:10.637 [2024-10-27 21:33:12.251236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.637 #37 NEW cov: 12383 ft: 15028 corp: 27/101b lim: 10 exec/s: 37 rss: 73Mb L: 2/10 MS: 2 ShuffleBytes-ChangeByte- 00:08:10.637 [2024-10-27 21:33:12.302141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:10.637 [2024-10-27 21:33:12.302168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.637 [2024-10-27 21:33:12.302284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.637 [2024-10-27 21:33:12.302301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.637 [2024-10-27 21:33:12.302407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.637 [2024-10-27 21:33:12.302424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.637 [2024-10-27 21:33:12.302540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.637 [2024-10-27 21:33:12.302557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.637 [2024-10-27 21:33:12.302659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ 
(00) qid:0 cid:8 nsid:0 cdw10:0000ff19 cdw11:00000000 00:08:10.637 [2024-10-27 21:33:12.302675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.637 #38 NEW cov: 12383 ft: 15053 corp: 28/111b lim: 10 exec/s: 38 rss: 73Mb L: 10/10 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:10.896 [2024-10-27 21:33:12.371613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:08:10.896 [2024-10-27 21:33:12.371642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.896 [2024-10-27 21:33:12.371766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003f0a cdw11:00000000 00:08:10.896 [2024-10-27 21:33:12.371786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.896 #39 NEW cov: 12383 ft: 15080 corp: 29/116b lim: 10 exec/s: 39 rss: 74Mb L: 5/10 MS: 1 CrossOver- 00:08:10.896 [2024-10-27 21:33:12.421338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000404 cdw11:00000000 00:08:10.896 [2024-10-27 21:33:12.421366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.896 #40 NEW cov: 12383 ft: 15121 corp: 30/118b lim: 10 exec/s: 20 rss: 74Mb L: 2/10 MS: 1 CopyPart- 00:08:10.896 #40 DONE cov: 12383 ft: 15121 corp: 30/118b lim: 10 exec/s: 20 rss: 74Mb 00:08:10.896 ###### Recommended dictionary. ###### 00:08:10.896 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:10.896 ###### End of recommended dictionary. 
###### 00:08:10.896 Done 40 runs in 2 second(s) 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:10.896 21:33:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:10.896 [2024-10-27 21:33:12.607957] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:10.896 [2024-10-27 21:33:12.608028] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3324980 ] 00:08:11.461 [2024-10-27 21:33:12.929994] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:11.461 [2024-10-27 21:33:12.976141] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.461 [2024-10-27 21:33:12.996109] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.461 [2024-10-27 21:33:13.048752] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:11.461 [2024-10-27 21:33:13.065064] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:11.461 INFO: Running with entropic power schedule (0xFF, 100). 00:08:11.461 INFO: Seed: 627980152 00:08:11.461 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:11.461 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:11.461 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:11.461 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.461 [2024-10-27 21:33:13.110309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.461 [2024-10-27 21:33:13.110337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.461 #2 INITED cov: 12184 ft: 12163 corp: 1/1b exec/s: 0 rss: 70Mb 00:08:11.461 [2024-10-27 21:33:13.150669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.461 [2024-10-27 21:33:13.150695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.461 [2024-10-27 21:33:13.150750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.461 [2024-10-27 21:33:13.150764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.461 [2024-10-27 21:33:13.150816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.461 [2024-10-27 21:33:13.150829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.461 [2024-10-27 21:33:13.150880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.461 [2024-10-27 21:33:13.150892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.719 #3 NEW cov: 12297 ft: 13431 corp: 2/5b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:11.719 [2024-10-27 21:33:13.210411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.210437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.719 [2024-10-27 21:33:13.210493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.210506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.719 #4 NEW cov: 12303 ft: 13892 corp: 3/7b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 InsertByte- 00:08:11.719 [2024-10-27 21:33:13.250730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.250754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.719 [2024-10-27 21:33:13.250812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.250828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.719 [2024-10-27 21:33:13.250885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.250899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.719 [2024-10-27 21:33:13.250955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.250969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.719 #5 NEW cov: 12388 ft: 14149 corp: 4/11b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 CopyPart- 00:08:11.719 [2024-10-27 21:33:13.310289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.310313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.719 #6 NEW cov: 12388 ft: 14284 corp: 5/12b lim: 5 exec/s: 0 rss: 71Mb L: 1/4 MS: 1 ChangeBit- 00:08:11.719 [2024-10-27 21:33:13.350447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.350473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.719 [2024-10-27 21:33:13.350532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.350545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.719 #7 NEW cov: 12388 ft: 14389 corp: 6/14b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 EraseBytes- 00:08:11.719 [2024-10-27 21:33:13.410805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.410830] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.719 [2024-10-27 21:33:13.410886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.410900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.719 [2024-10-27 21:33:13.410960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.410973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.719 [2024-10-27 21:33:13.411027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.719 [2024-10-27 21:33:13.411041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.719 #8 NEW cov: 12388 ft: 14501 corp: 7/18b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:11.978 [2024-10-27 21:33:13.450452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.450477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.978 [2024-10-27 21:33:13.450537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.450550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.978 #9 NEW cov: 12388 ft: 14589 corp: 8/20b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ChangeBit- 00:08:11.978 [2024-10-27 21:33:13.510833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.510858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.978 [2024-10-27 21:33:13.510917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.510931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.978 [2024-10-27 21:33:13.510991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.511005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.978 [2024-10-27 21:33:13.511059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:11.978 [2024-10-27 21:33:13.511072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.978 #10 NEW cov: 12388 ft: 14676 corp: 9/24b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:11.978 [2024-10-27 21:33:13.550320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.550345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.978 #11 NEW cov: 12388 ft: 14772 corp: 10/25b lim: 5 exec/s: 0 rss: 71Mb L: 1/4 MS: 1 ShuffleBytes- 00:08:11.978 [2024-10-27 21:33:13.610502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.610527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.978 [2024-10-27 21:33:13.610582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.610596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.978 #12 NEW cov: 12388 ft: 14784 corp: 11/27b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:11.978 [2024-10-27 21:33:13.650565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.650589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.978 [2024-10-27 21:33:13.650644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.650658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.978 #13 NEW cov: 12388 ft: 14842 corp: 12/29b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 ChangeByte- 00:08:11.978 [2024-10-27 21:33:13.690584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.690608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.978 [2024-10-27 21:33:13.690663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.978 [2024-10-27 21:33:13.690676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.236 #14 NEW cov: 12388 ft: 14875 corp: 13/31b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 CopyPart- 00:08:12.236 [2024-10-27 21:33:13.750617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.750641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.236 [2024-10-27 21:33:13.750698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.750711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.236 #15 NEW cov: 12388 ft: 14896 corp: 14/33b lim: 5 exec/s: 0 rss: 71Mb L: 2/4 MS: 1 CopyPart- 00:08:12.236 [2024-10-27 21:33:13.790657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.790683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.236 [2024-10-27 21:33:13.790740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.790754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.236 #16 NEW cov: 12388 ft: 14922 corp: 15/35b lim: 5 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 ChangeByte- 00:08:12.236 [2024-10-27 21:33:13.850956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.850981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.236 [2024-10-27 21:33:13.851034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.851048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.236 [2024-10-27 21:33:13.851101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.851114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.236 [2024-10-27 21:33:13.851166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.851179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.236 #17 NEW cov: 12388 ft: 14931 corp: 16/39b lim: 5 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 ChangeByte- 00:08:12.236 [2024-10-27 21:33:13.890486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.890514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.236 #18 
NEW cov: 12388 ft: 14953 corp: 17/40b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 ChangeBit- 00:08:12.236 [2024-10-27 21:33:13.930663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.930688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.236 [2024-10-27 21:33:13.930741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.236 [2024-10-27 21:33:13.930755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.236 #19 NEW cov: 12388 ft: 14978 corp: 18/42b lim: 5 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 EraseBytes- 00:08:12.494 [2024-10-27 21:33:13.970980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.494 [2024-10-27 21:33:13.971005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.494 [2024-10-27 21:33:13.971061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.494 [2024-10-27 21:33:13.971074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.494 [2024-10-27 21:33:13.971126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.494 [2024-10-27 21:33:13.971140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.494 [2024-10-27 21:33:13.971192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.494 [2024-10-27 21:33:13.971205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.752 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:12.752 #20 NEW cov: 12411 ft: 15065 corp: 19/46b lim: 5 exec/s: 20 rss: 73Mb L: 4/4 MS: 1 CrossOver- 00:08:12.752 [2024-10-27 21:33:14.280910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.280946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.752 [2024-10-27 21:33:14.281017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.281032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.752 #21 NEW cov: 12411 ft: 15106 corp: 20/48b lim: 5 exec/s: 21 rss: 73Mb L: 2/4 MS: 1 
ChangeBinInt- 00:08:12.752 [2024-10-27 21:33:14.320987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.321013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.752 [2024-10-27 21:33:14.321072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.321085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.752 [2024-10-27 21:33:14.321145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.321158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.752 #22 NEW cov: 12411 ft: 15270 corp: 21/51b lim: 5 exec/s: 22 rss: 73Mb L: 3/4 MS: 1 EraseBytes- 00:08:12.752 [2024-10-27 21:33:14.381138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.381163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.752 [2024-10-27 21:33:14.381221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.381234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.752 [2024-10-27 21:33:14.381290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.381303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.752 [2024-10-27 21:33:14.381359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.381372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.752 #23 NEW cov: 12411 ft: 15301 corp: 22/55b lim: 5 exec/s: 23 rss: 73Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:12.752 [2024-10-27 21:33:14.420657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.420683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.752 #24 NEW cov: 12411 ft: 15303 corp: 23/56b lim: 5 exec/s: 24 rss: 73Mb L: 1/4 MS: 1 CrossOver- 00:08:12.752 [2024-10-27 21:33:14.460819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.460843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.752 [2024-10-27 21:33:14.460899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.752 [2024-10-27 21:33:14.460913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.010 #25 NEW cov: 12411 ft: 15309 corp: 24/58b lim: 5 exec/s: 25 rss: 73Mb L: 2/4 MS: 1 InsertByte- 00:08:13.010 [2024-10-27 21:33:14.520855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.010 [2024-10-27 21:33:14.520880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.010 [2024-10-27 21:33:14.520937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.010 [2024-10-27 21:33:14.520956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.010 #26 NEW cov: 12411 ft: 15327 corp: 25/60b lim: 5 exec/s: 26 rss: 73Mb L: 2/4 MS: 1 ChangeBinInt- 00:08:13.010 [2024-10-27 21:33:14.581078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.011 [2024-10-27 21:33:14.581103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.011 [2024-10-27 21:33:14.581164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.011 [2024-10-27 21:33:14.581177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.011 [2024-10-27 21:33:14.581252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.011 [2024-10-27 21:33:14.581266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.011 #27 NEW cov: 12411 ft: 15338 corp: 26/63b lim: 5 exec/s: 27 rss: 73Mb L: 3/4 MS: 1 CopyPart- 00:08:13.011 [2024-10-27 21:33:14.621278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.011 [2024-10-27 21:33:14.621304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.011 [2024-10-27 21:33:14.621363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.011 [2024-10-27 21:33:14.621376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.011 
[2024-10-27 21:33:14.621435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.011 [2024-10-27 21:33:14.621449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.011 [2024-10-27 21:33:14.621507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.011 [2024-10-27 21:33:14.621520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.011 #28 NEW cov: 12411 ft: 15370 corp: 27/67b lim: 5 exec/s: 28 rss: 73Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:13.011 [2024-10-27 21:33:14.680949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.011 [2024-10-27 21:33:14.680975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.011 [2024-10-27 21:33:14.681032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.011 [2024-10-27 21:33:14.681045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.011 #29 NEW cov: 12411 ft: 15386 corp: 28/69b lim: 5 exec/s: 29 rss: 73Mb L: 2/4 MS: 1 ChangeByte- 00:08:13.270 [2024-10-27 21:33:14.741162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.741188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 21:33:14.741245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.741258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 21:33:14.741320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.741333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.270 #30 NEW cov: 12411 ft: 15391 corp: 29/72b lim: 5 exec/s: 30 rss: 73Mb L: 3/4 MS: 1 EraseBytes- 00:08:13.270 [2024-10-27 21:33:14.781008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.781033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 21:33:14.781089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:13.270 [2024-10-27 21:33:14.781103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.270 #31 NEW cov: 12411 ft: 15469 corp: 30/74b lim: 5 exec/s: 31 rss: 74Mb L: 2/4 MS: 1 EraseBytes- 00:08:13.270 [2024-10-27 21:33:14.841075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.841101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 21:33:14.841174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.841188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.270 #32 NEW cov: 12411 ft: 15480 corp: 31/76b lim: 5 exec/s: 32 rss: 74Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:13.270 [2024-10-27 21:33:14.901358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.901383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 21:33:14.901442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.901456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 21:33:14.901514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.901527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 21:33:14.901586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.901600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.270 #33 NEW cov: 12411 ft: 15484 corp: 32/80b lim: 5 exec/s: 33 rss: 74Mb L: 4/4 MS: 1 ChangeByte- 00:08:13.270 [2024-10-27 21:33:14.941364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.941389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 21:33:14.941447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.941463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 
21:33:14.941521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.941534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 21:33:14.941592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.941605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.270 #34 NEW cov: 12411 ft: 15498 corp: 33/84b lim: 5 exec/s: 34 rss: 74Mb L: 4/4 MS: 1 CMP- DE: "\377\377\377\036"- 00:08:13.270 [2024-10-27 21:33:14.981042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.981066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.270 [2024-10-27 21:33:14.981124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.270 [2024-10-27 21:33:14.981137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.529 #35 NEW cov: 12411 ft: 15515 corp: 34/86b lim: 5 exec/s: 35 rss: 74Mb L: 2/4 MS: 1 CopyPart- 00:08:13.530 [2024-10-27 21:33:15.021096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.530 [2024-10-27 21:33:15.021120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.530 [2024-10-27 21:33:15.021180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.530 [2024-10-27 21:33:15.021193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.530 #36 NEW cov: 12411 ft: 15522 corp: 35/88b lim: 5 exec/s: 36 rss: 74Mb L: 2/4 MS: 1 CrossOver- 00:08:13.530 [2024-10-27 21:33:15.081305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.530 [2024-10-27 21:33:15.081330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.530 [2024-10-27 21:33:15.081387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.530 [2024-10-27 21:33:15.081401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.530 [2024-10-27 21:33:15.081457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:13.530 [2024-10-27 21:33:15.081471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.530 #37 NEW cov: 12411 ft: 15543 corp: 36/91b lim: 5 exec/s: 18 rss: 74Mb L: 3/4 MS: 1 InsertByte- 00:08:13.530 #37 DONE cov: 12411 ft: 15543 corp: 36/91b lim: 5 exec/s: 18 rss: 74Mb 00:08:13.530 ###### Recommended dictionary. ###### 00:08:13.530 "\377\377\377\036" # Uses: 0 00:08:13.530 ###### End of recommended dictionary. ###### 00:08:13.530 Done 37 runs in 2 second(s) 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:13.530 21:33:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:08:13.530 [2024-10-27 21:33:15.246275] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 
00:08:13.530 [2024-10-27 21:33:15.246354] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3325515 ] 00:08:14.097 [2024-10-27 21:33:15.562783] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:14.098 [2024-10-27 21:33:15.610082] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.098 [2024-10-27 21:33:15.632864] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.098 [2024-10-27 21:33:15.685148] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:14.098 [2024-10-27 21:33:15.701470] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:14.098 INFO: Running with entropic power schedule (0xFF, 100). 00:08:14.098 INFO: Seed: 3264976296 00:08:14.098 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:14.098 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:14.098 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:14.098 INFO: A corpus is not provided, starting from an empty corpus 00:08:14.098 [2024-10-27 21:33:15.746121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.098 [2024-10-27 21:33:15.746155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.098 #2 INITED cov: 12184 ft: 12176 corp: 1/1b exec/s: 0 rss: 70Mb 00:08:14.098 [2024-10-27 21:33:15.796187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.098 [2024-10-27 21:33:15.796223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.098 [2024-10-27 21:33:15.796273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.098 [2024-10-27 21:33:15.796290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.098 [2024-10-27 21:33:15.796320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.098 [2024-10-27 21:33:15.796336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.098 [2024-10-27 21:33:15.796366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.098 [2024-10-27 21:33:15.796382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.356 #3 NEW cov: 12297 ft: 13646 corp: 2/5b lim: 5 exec/s: 0 rss: 70Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:14.356 [2024-10-27 21:33:15.886204] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:15.886235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.356 [2024-10-27 21:33:15.886283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:15.886299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.356 [2024-10-27 21:33:15.886328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:15.886344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.356 [2024-10-27 21:33:15.886372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:15.886387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.356 #4 NEW cov: 12303 ft: 13749 corp: 3/9b lim: 5 exec/s: 0 rss: 70Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:14.356 [2024-10-27 21:33:15.936191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:15.936222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.356 [2024-10-27 21:33:15.936253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:15.936269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.356 [2024-10-27 21:33:15.936296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:15.936311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.356 [2024-10-27 21:33:15.936338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:15.936356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.356 #5 NEW cov: 12388 ft: 14073 corp: 4/13b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:14.356 [2024-10-27 21:33:16.026265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:16.026295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:14.356 [2024-10-27 21:33:16.026343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:16.026359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.356 [2024-10-27 21:33:16.026388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:16.026404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.356 [2024-10-27 21:33:16.026433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:16.026448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.356 [2024-10-27 21:33:16.026476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.356 [2024-10-27 21:33:16.026491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.614 #6 NEW cov: 12388 ft: 14299 corp: 5/18b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 InsertByte- 00:08:14.614 [2024-10-27 21:33:16.116159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.116190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.614 [2024-10-27 21:33:16.116238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.116253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.614 #7 NEW cov: 12388 ft: 14610 corp: 6/20b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 CopyPart- 00:08:14.614 [2024-10-27 21:33:16.176143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.176174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.614 [2024-10-27 21:33:16.176208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.176223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.614 #8 NEW cov: 12388 ft: 14674 corp: 7/22b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 InsertByte- 00:08:14.614 [2024-10-27 21:33:16.236321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:14.614 [2024-10-27 21:33:16.236350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.614 [2024-10-27 21:33:16.236403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.236418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.614 [2024-10-27 21:33:16.236448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.236463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.614 [2024-10-27 21:33:16.236492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.236506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.614 [2024-10-27 21:33:16.236535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.236549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.614 #9 NEW cov: 12388 ft: 14695 corp: 8/27b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CrossOver- 00:08:14.614 [2024-10-27 21:33:16.326377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.326407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.614 [2024-10-27 21:33:16.326454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.326470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.614 [2024-10-27 21:33:16.326499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.614 [2024-10-27 21:33:16.326515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.615 [2024-10-27 21:33:16.326543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.615 [2024-10-27 21:33:16.326558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.615 [2024-10-27 21:33:16.326587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.615 [2024-10-27 
21:33:16.326601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.873 #10 NEW cov: 12388 ft: 14749 corp: 9/32b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:14.873 [2024-10-27 21:33:16.416201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.873 [2024-10-27 21:33:16.416230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.873 [2024-10-27 21:33:16.416277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.873 [2024-10-27 21:33:16.416293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.873 #11 NEW cov: 12388 ft: 14818 corp: 10/34b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 ChangeBit- 00:08:14.873 [2024-10-27 21:33:16.506345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.873 [2024-10-27 21:33:16.506374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.873 [2024-10-27 21:33:16.506422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.873 [2024-10-27 21:33:16.506437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.873 [2024-10-27 21:33:16.506466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.873 [2024-10-27 21:33:16.506481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.873 [2024-10-27 21:33:16.506510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.873 [2024-10-27 21:33:16.506525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.873 #12 NEW cov: 12388 ft: 14940 corp: 11/38b lim: 5 exec/s: 0 rss: 71Mb L: 4/5 MS: 1 ShuffleBytes- 00:08:14.873 [2024-10-27 21:33:16.566213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.873 [2024-10-27 21:33:16.566242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.873 [2024-10-27 21:33:16.566289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.873 [2024-10-27 21:33:16.566304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.390 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:15.390 #13 NEW cov: 12411 ft: 14970 corp: 12/40b lim: 5 exec/s: 13 rss: 72Mb L: 2/5 MS: 1 ChangeBit- 00:08:15.390 [2024-10-27 21:33:16.917688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.390 [2024-10-27 21:33:16.917740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.390 [2024-10-27 21:33:16.917821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.390 [2024-10-27 21:33:16.917846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.390 [2024-10-27 21:33:16.917923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.390 [2024-10-27 21:33:16.917953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.390 [2024-10-27 21:33:16.918030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.390 [2024-10-27 21:33:16.918054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.390 #14 NEW cov: 12411 ft: 15128 corp: 13/44b lim: 5 exec/s: 14 rss: 72Mb L: 4/5 MS: 1 EraseBytes- 00:08:15.390 [2024-10-27 21:33:16.987019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.390 [2024-10-27 21:33:16.987045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.390 [2024-10-27 21:33:16.987101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.390 [2024-10-27 21:33:16.987114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.390 #15 NEW cov: 12411 ft: 15250 corp: 14/46b lim: 5 exec/s: 15 rss: 72Mb L: 2/5 MS: 1 ChangeBit- 00:08:15.390 [2024-10-27 21:33:17.047069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.390 [2024-10-27 21:33:17.047094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.391 [2024-10-27 21:33:17.047147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.391 [2024-10-27 21:33:17.047160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.391 #16 NEW cov: 12411 ft: 15309 corp: 15/48b lim: 5 exec/s: 16 rss: 72Mb L: 2/5 MS: 1 ChangeBit- 00:08:15.391 [2024-10-27 
21:33:17.087041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.391 [2024-10-27 21:33:17.087066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.391 [2024-10-27 21:33:17.087122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.391 [2024-10-27 21:33:17.087135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.391 #17 NEW cov: 12411 ft: 15348 corp: 16/50b lim: 5 exec/s: 17 rss: 72Mb L: 2/5 MS: 1 InsertByte- 00:08:15.649 [2024-10-27 21:33:17.127352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.649 [2024-10-27 21:33:17.127377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.649 [2024-10-27 21:33:17.127432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.649 [2024-10-27 21:33:17.127446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.649 [2024-10-27 21:33:17.127501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.649 [2024-10-27 21:33:17.127514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.649 [2024-10-27 21:33:17.127568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.649 [2024-10-27 21:33:17.127581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.649 #18 NEW cov: 12411 ft: 15375 corp: 17/54b lim: 5 exec/s: 18 rss: 72Mb L: 4/5 MS: 1 ShuffleBytes- 00:08:15.649 [2024-10-27 21:33:17.187443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.649 [2024-10-27 21:33:17.187471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.649 [2024-10-27 21:33:17.187527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.649 [2024-10-27 21:33:17.187541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.649 [2024-10-27 21:33:17.187594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.649 [2024-10-27 21:33:17.187624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.649 [2024-10-27 21:33:17.187678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.649 [2024-10-27 21:33:17.187691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.649 #19 NEW cov: 12411 ft: 15392 corp: 18/58b lim: 5 exec/s: 19 rss: 72Mb L: 4/5 MS: 1 ShuffleBytes- 00:08:15.650 [2024-10-27 21:33:17.227579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.227605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.227659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.227673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.227727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.227740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.227795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.227808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.227861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.227874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.650 #20 NEW cov: 12411 ft: 15400 corp: 19/63b lim: 5 exec/s: 20 rss: 72Mb L: 5/5 MS: 1 CopyPart- 00:08:15.650 [2024-10-27 21:33:17.267470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.267495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.267552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.267566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.267620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.267636] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.267687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.267700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.650 #21 NEW cov: 12411 ft: 15437 corp: 20/67b lim: 5 exec/s: 21 rss: 72Mb L: 4/5 MS: 1 CMP- DE: "\022\000"- 00:08:15.650 [2024-10-27 21:33:17.327456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.327482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.327538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.327551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.327607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.327620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.327675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.327688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.650 #22 NEW cov: 12411 ft: 15457 corp: 21/71b lim: 5 exec/s: 22 rss: 72Mb L: 4/5 MS: 1 ChangeByte- 00:08:15.650 [2024-10-27 21:33:17.367194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.367220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.650 [2024-10-27 21:33:17.367274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.650 [2024-10-27 21:33:17.367287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.908 #23 NEW cov: 12411 ft: 15475 corp: 22/73b lim: 5 exec/s: 23 rss: 72Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:15.908 [2024-10-27 21:33:17.427695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.908 [2024-10-27 21:33:17.427720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.908 [2024-10-27 21:33:17.427777] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.908 [2024-10-27 21:33:17.427791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.908 [2024-10-27 21:33:17.427845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.908 [2024-10-27 21:33:17.427858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.908 [2024-10-27 21:33:17.427912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.908 [2024-10-27 21:33:17.427928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.908 [2024-10-27 21:33:17.427982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.908 [2024-10-27 21:33:17.427996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.908 #24 NEW cov: 12411 ft: 15486 corp: 23/78b lim: 5 exec/s: 24 rss: 72Mb L: 5/5 MS: 1 ChangeByte- 00:08:15.908 [2024-10-27 21:33:17.467368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.908 [2024-10-27 21:33:17.467393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.908 [2024-10-27 21:33:17.467452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.908 [2024-10-27 21:33:17.467465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.908 [2024-10-27 21:33:17.467536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.908 [2024-10-27 21:33:17.467550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.908 #25 NEW cov: 12411 ft: 15651 corp: 24/81b lim: 5 exec/s: 25 rss: 72Mb L: 3/5 MS: 1 CrossOver- 00:08:15.908 [2024-10-27 21:33:17.507582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.908 [2024-10-27 21:33:17.507608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.908 [2024-10-27 21:33:17.507665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.908 [2024-10-27 21:33:17.507679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:08:15.908 [2024-10-27 21:33:17.507733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.909 [2024-10-27 21:33:17.507746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.909 [2024-10-27 21:33:17.507799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.909 [2024-10-27 21:33:17.507812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.909 #26 NEW cov: 12411 ft: 15687 corp: 25/85b lim: 5 exec/s: 26 rss: 72Mb L: 4/5 MS: 1 CrossOver- 00:08:15.909 [2024-10-27 21:33:17.547758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.909 [2024-10-27 21:33:17.547783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.909 [2024-10-27 21:33:17.547839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.909 [2024-10-27 21:33:17.547852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.909 [2024-10-27 21:33:17.547906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.909 [2024-10-27 21:33:17.547923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.909 [2024-10-27 21:33:17.547985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.909 [2024-10-27 21:33:17.547998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.909 [2024-10-27 21:33:17.548051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.909 [2024-10-27 21:33:17.548064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.909 #27 NEW cov: 12411 ft: 15725 corp: 26/90b lim: 5 exec/s: 27 rss: 72Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:15.909 [2024-10-27 21:33:17.607274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.909 [2024-10-27 21:33:17.607299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.909 [2024-10-27 21:33:17.607355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.909 [2024-10-27 21:33:17.607369] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.909 #28 NEW cov: 12411 ft: 15736 corp: 27/92b lim: 5 exec/s: 28 rss: 72Mb L: 2/5 MS: 1 ChangeBit- 00:08:16.168 [2024-10-27 21:33:17.647792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.647818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.168 [2024-10-27 21:33:17.647873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.647887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.168 [2024-10-27 21:33:17.647944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.647958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.168 [2024-10-27 21:33:17.648012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.648025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.168 [2024-10-27 21:33:17.648079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.648092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.168 #29 NEW cov: 12411 ft: 15781 corp: 28/97b lim: 5 exec/s: 29 rss: 72Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:16.168 [2024-10-27 21:33:17.687304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.687328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.168 [2024-10-27 21:33:17.687386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.687400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.168 #30 NEW cov: 12411 ft: 15800 corp: 29/99b lim: 5 exec/s: 30 rss: 72Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:16.168 [2024-10-27 21:33:17.727653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.727677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.168 [2024-10-27 21:33:17.727732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) 
qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.727746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.168 [2024-10-27 21:33:17.727799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.727829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.168 [2024-10-27 21:33:17.727883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.168 [2024-10-27 21:33:17.727896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.168 #31 NEW cov: 12411 ft: 15811 corp: 30/103b lim: 5 exec/s: 15 rss: 72Mb L: 4/5 MS: 1 PersAutoDict- DE: "\022\000"- 00:08:16.168 #31 DONE cov: 12411 ft: 15811 corp: 30/103b lim: 5 exec/s: 15 rss: 72Mb 00:08:16.168 ###### Recommended dictionary. ###### 00:08:16.168 "\022\000" # Uses: 1 00:08:16.168 ###### End of recommended dictionary. ###### 00:08:16.168 Done 31 runs in 2 second(s) 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:16.168 21:33:17 llvm_fuzz.nvmf_llvm_fuzz -- 
nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:08:16.426 [2024-10-27 21:33:17.895186] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:16.426 [2024-10-27 21:33:17.895262] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3326044 ] 00:08:16.684 [2024-10-27 21:33:18.208390] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:16.684 [2024-10-27 21:33:18.254394] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.684 [2024-10-27 21:33:18.273542] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.684 [2024-10-27 21:33:18.325891] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:16.684 [2024-10-27 21:33:18.342204] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:16.684 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.684 INFO: Seed: 1609011576 00:08:16.684 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:16.684 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:16.684 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:16.684 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.684 #2 INITED exec/s: 0 rss: 64Mb 00:08:16.684 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:16.684 This may also happen if the target rejected all inputs we tried so far 00:08:16.684 [2024-10-27 21:33:18.387035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.684 [2024-10-27 21:33:18.387070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.684 [2024-10-27 21:33:18.387120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.684 [2024-10-27 21:33:18.387137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.684 [2024-10-27 21:33:18.387168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.684 [2024-10-27 21:33:18.387185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.684 [2024-10-27 21:33:18.387216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.684 [2024-10-27 21:33:18.387231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.201 NEW_FUNC[1/715]: 0x46bc18 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:17.201 NEW_FUNC[2/715]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:17.201 #4 NEW cov: 12207 ft: 12204 corp: 2/35b lim: 40 exec/s: 0 rss: 71Mb L: 34/34 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:17.201 [2024-10-27 21:33:18.726894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:6affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.201 [2024-10-27 21:33:18.726931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.201 #8 NEW cov: 12320 ft: 13404 corp: 3/45b lim: 40 exec/s: 0 rss: 72Mb L: 10/34 MS: 4 ChangeBit-ShuffleBytes-ChangeBit-CrossOver- 00:08:17.201 [2024-10-27 21:33:18.786878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.201 [2024-10-27 21:33:18.786909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.201 [2024-10-27 21:33:18.786968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.201 [2024-10-27 21:33:18.786984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.201 [2024-10-27 21:33:18.787015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.201 
[2024-10-27 21:33:18.787031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.201 #9 NEW cov: 12326 ft: 13855 corp: 4/76b lim: 40 exec/s: 0 rss: 72Mb L: 31/34 MS: 1 EraseBytes- 00:08:17.201 [2024-10-27 21:33:18.877034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.201 [2024-10-27 21:33:18.877066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.201 [2024-10-27 21:33:18.877101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.201 [2024-10-27 21:33:18.877116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.201 [2024-10-27 21:33:18.877147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.201 [2024-10-27 21:33:18.877163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.201 [2024-10-27 21:33:18.877193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.201 [2024-10-27 21:33:18.877208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.460 #10 NEW cov: 12411 ft: 14118 corp: 5/108b lim: 40 exec/s: 0 rss: 72Mb L: 32/34 MS: 1 InsertByte- 00:08:17.460 [2024-10-27 21:33:18.967005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.460 [2024-10-27 21:33:18.967036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.460 [2024-10-27 21:33:18.967085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.460 [2024-10-27 21:33:18.967101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.460 [2024-10-27 21:33:18.967131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.460 [2024-10-27 21:33:18.967147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.460 [2024-10-27 21:33:18.967176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.460 [2024-10-27 21:33:18.967196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.460 #11 NEW cov: 12411 ft: 14269 corp: 6/142b lim: 40 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 CopyPart- 00:08:17.460 [2024-10-27 21:33:19.026847] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:6affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.460 [2024-10-27 21:33:19.026878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.460 #12 NEW cov: 12411 ft: 14323 corp: 7/152b lim: 40 exec/s: 0 rss: 72Mb L: 10/34 MS: 1 ShuffleBytes- 00:08:17.460 [2024-10-27 21:33:19.116880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:6affffff cdw11:01000082 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.460 [2024-10-27 21:33:19.116911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.718 #13 NEW cov: 12411 ft: 14419 corp: 8/162b lim: 40 exec/s: 0 rss: 72Mb L: 10/34 MS: 1 CMP- DE: "\001\000\000\202"- 00:08:17.718 [2024-10-27 21:33:19.206914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:6affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.718 [2024-10-27 21:33:19.206950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.718 #14 NEW cov: 12411 ft: 14484 corp: 9/174b lim: 40 exec/s: 0 rss: 72Mb L: 12/34 MS: 1 CopyPart- 00:08:17.718 [2024-10-27 21:33:19.266888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0b797979 cdw11:79797979 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.718 [2024-10-27 21:33:19.266919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.718 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:17.718 #17 NEW cov: 12434 ft: 14524 corp: 10/186b lim: 40 exec/s: 0 rss: 72Mb L: 12/34 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:17.718 [2024-10-27 21:33:19.326899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:6affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.718 [2024-10-27 21:33:19.326931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.718 #18 NEW cov: 12434 ft: 14564 corp: 11/196b lim: 40 exec/s: 0 rss: 72Mb L: 10/34 MS: 1 ChangeBit- 00:08:17.718 [2024-10-27 21:33:19.387133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.718 [2024-10-27 21:33:19.387163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.718 [2024-10-27 21:33:19.387213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.718 [2024-10-27 21:33:19.387229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.718 [2024-10-27 21:33:19.387259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.718 [2024-10-27 21:33:19.387275] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.718 [2024-10-27 21:33:19.387305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffff1eff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.718 [2024-10-27 21:33:19.387321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.718 #19 NEW cov: 12434 ft: 14593 corp: 12/230b lim: 40 exec/s: 19 rss: 72Mb L: 34/34 MS: 1 ChangeByte- 00:08:17.718 [2024-10-27 21:33:19.436931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:6affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.718 [2024-10-27 21:33:19.436967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.977 #20 NEW cov: 12434 ft: 14643 corp: 13/239b lim: 40 exec/s: 20 rss: 72Mb L: 9/34 MS: 1 CrossOver- 00:08:17.977 [2024-10-27 21:33:19.527025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0b6affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.977 [2024-10-27 21:33:19.527054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.977 #21 NEW cov: 12434 ft: 14712 corp: 14/251b lim: 40 exec/s: 21 rss: 72Mb L: 12/34 MS: 1 CrossOver- 00:08:17.977 [2024-10-27 21:33:19.617189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.977 [2024-10-27 21:33:19.617219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.977 [2024-10-27 21:33:19.617254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.977 [2024-10-27 21:33:19.617270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.977 [2024-10-27 21:33:19.617300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.977 [2024-10-27 21:33:19.617316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.977 [2024-10-27 21:33:19.617346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.977 [2024-10-27 21:33:19.617361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.977 #22 NEW cov: 12434 ft: 14724 corp: 15/286b lim: 40 exec/s: 22 rss: 73Mb L: 35/35 MS: 1 InsertByte- 00:08:17.977 [2024-10-27 21:33:19.667140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.977 [2024-10-27 21:33:19.667169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:17.977 [2024-10-27 21:33:19.667218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.977 [2024-10-27 21:33:19.667234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.977 [2024-10-27 21:33:19.667264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:03ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.977 [2024-10-27 21:33:19.667280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.977 [2024-10-27 21:33:19.667310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.977 [2024-10-27 21:33:19.667325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.235 #23 NEW cov: 12434 ft: 14752 corp: 16/321b lim: 40 exec/s: 23 rss: 73Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:18.235 [2024-10-27 21:33:19.757272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.235 [2024-10-27 21:33:19.757308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.235 [2024-10-27 21:33:19.757343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.235 [2024-10-27 21:33:19.757360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.235 [2024-10-27 21:33:19.757391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.235 [2024-10-27 21:33:19.757407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.236 [2024-10-27 21:33:19.757438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffffff7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.236 [2024-10-27 21:33:19.757453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.236 #24 NEW cov: 12434 ft: 14759 corp: 17/355b lim: 40 exec/s: 24 rss: 73Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:18.236 [2024-10-27 21:33:19.807125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.236 [2024-10-27 21:33:19.807158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.236 [2024-10-27 21:33:19.807193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.236 [2024-10-27 21:33:19.807210] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.236 #25 NEW cov: 12434 ft: 15005 corp: 18/377b lim: 40 exec/s: 25 rss: 73Mb L: 22/35 MS: 1 CrossOver- 00:08:18.236 [2024-10-27 21:33:19.867100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ff79ef10 cdw11:da134ace SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.236 [2024-10-27 21:33:19.867131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.236 #26 NEW cov: 12434 ft: 15064 corp: 19/389b lim: 40 exec/s: 26 rss: 73Mb L: 12/35 MS: 1 CMP- DE: "\377y\357\020\332\023J\316"- 00:08:18.236 [2024-10-27 21:33:19.927172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:6aff79ef cdw11:10da134a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.236 [2024-10-27 21:33:19.927202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.236 [2024-10-27 21:33:19.927253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ce797979 cdw11:79ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.236 [2024-10-27 21:33:19.927269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.495 #27 NEW cov: 12434 ft: 15079 corp: 20/410b lim: 40 exec/s: 27 rss: 73Mb L: 21/35 MS: 1 CrossOver- 00:08:18.495 [2024-10-27 21:33:20.017223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0b796679 cdw11:79797979 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.495 [2024-10-27 21:33:20.017258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.495 #28 NEW cov: 12434 ft: 15086 corp: 21/422b lim: 40 exec/s: 28 rss: 73Mb L: 12/35 MS: 1 ChangeByte- 00:08:18.495 [2024-10-27 21:33:20.077396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.495 [2024-10-27 21:33:20.077438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.495 [2024-10-27 21:33:20.077475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.495 [2024-10-27 21:33:20.077492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.495 [2024-10-27 21:33:20.077524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff0100 cdw11:0082ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.495 [2024-10-27 21:33:20.077540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.495 [2024-10-27 21:33:20.077571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.495 [2024-10-27 21:33:20.077587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.495 #29 NEW cov: 12434 ft: 15111 corp: 22/460b lim: 40 exec/s: 29 rss: 73Mb L: 38/38 MS: 1 PersAutoDict- DE: "\001\000\000\202"- 00:08:18.495 [2024-10-27 21:33:20.177426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.495 [2024-10-27 21:33:20.177462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.495 [2024-10-27 21:33:20.177498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.495 [2024-10-27 21:33:20.177515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.495 [2024-10-27 21:33:20.177546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.495 [2024-10-27 21:33:20.177562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.495 [2024-10-27 21:33:20.177593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff1eff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.495 [2024-10-27 21:33:20.177609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.754 #30 NEW cov: 12434 ft: 15131 corp: 23/498b lim: 40 exec/s: 30 rss: 73Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:18.754 [2024-10-27 21:33:20.267240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:6affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.754 [2024-10-27 21:33:20.267272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.754 #31 NEW cov: 12434 ft: 15159 corp: 24/512b lim: 40 exec/s: 31 rss: 73Mb L: 14/38 MS: 1 CrossOver- 00:08:18.754 [2024-10-27 21:33:20.357313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.754 [2024-10-27 21:33:20.357345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.754 [2024-10-27 21:33:20.357379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.754 [2024-10-27 21:33:20.357395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.754 #32 pulse cov: 12434 ft: 15195 corp: 24/512b lim: 40 exec/s: 16 rss: 73Mb 00:08:18.754 #32 NEW cov: 12434 ft: 15195 corp: 25/534b lim: 40 exec/s: 16 rss: 73Mb L: 22/38 MS: 1 EraseBytes- 00:08:18.754 #32 DONE cov: 12434 ft: 15195 corp: 25/534b lim: 40 exec/s: 16 rss: 73Mb 00:08:18.754 ###### Recommended dictionary. ###### 00:08:18.754 "\001\000\000\202" # Uses: 1 00:08:18.754 "\377y\357\020\332\023J\316" # Uses: 0 00:08:18.754 ###### End of recommended dictionary. 
###### 00:08:18.754 Done 32 runs in 2 second(s) 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:19.013 21:33:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:08:19.013 [2024-10-27 21:33:20.546123] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:19.013 [2024-10-27 21:33:20.546199] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3326375 ] 00:08:19.284 [2024-10-27 21:33:20.866594] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:19.284 [2024-10-27 21:33:20.913776] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.284 [2024-10-27 21:33:20.934040] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.284 [2024-10-27 21:33:20.986661] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:19.586 [2024-10-27 21:33:21.002853] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:19.586 INFO: Running with entropic power schedule (0xFF, 100). 00:08:19.586 INFO: Seed: 4271033778 00:08:19.586 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:19.586 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:19.586 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.586 INFO: A corpus is not provided, starting from an empty corpus 00:08:19.586 #2 INITED exec/s: 0 rss: 65Mb 00:08:19.586 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:19.586 This may also happen if the target rejected all inputs we tried so far 00:08:19.586 [2024-10-27 21:33:21.048163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a29ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.586 [2024-10-27 21:33:21.048193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.870 NEW_FUNC[1/716]: 0x46d988 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:19.870 NEW_FUNC[2/716]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:19.870 #27 NEW cov: 12219 ft: 12209 corp: 2/11b lim: 40 exec/s: 0 rss: 72Mb L: 10/10 MS: 5 ShuffleBytes-InsertByte-ChangeBinInt-ChangeBinInt-InsertRepeatedBytes- 00:08:19.870 [2024-10-27 21:33:21.378728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.378759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.870 [2024-10-27 21:33:21.378817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.378831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.870 [2024-10-27 21:33:21.378886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.378899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.870 [2024-10-27 21:33:21.378957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.378970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.870 #40 NEW cov: 12332 ft: 13570 
corp: 3/48b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 3 ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:19.870 [2024-10-27 21:33:21.418525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.418551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.870 [2024-10-27 21:33:21.418626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.418640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.870 [2024-10-27 21:33:21.418699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.418712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.870 #41 NEW cov: 12338 ft: 14130 corp: 4/73b lim: 40 exec/s: 0 rss: 72Mb L: 25/37 MS: 1 EraseBytes- 00:08:19.870 [2024-10-27 21:33:21.478529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.478555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.870 [2024-10-27 21:33:21.478633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.478651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.870 [2024-10-27 21:33:21.478711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.478724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.870 #45 NEW cov: 12423 ft: 14351 corp: 5/103b lim: 40 exec/s: 0 rss: 72Mb L: 30/37 MS: 4 ShuffleBytes-CrossOver-CMP-InsertRepeatedBytes- DE: "\377\377"- 00:08:19.870 [2024-10-27 21:33:21.518706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.518732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.870 [2024-10-27 21:33:21.518792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.518806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.870 [2024-10-27 21:33:21.518864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:19.870 [2024-10-27 21:33:21.518877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.870 [2024-10-27 21:33:21.518935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.518952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.870 #46 NEW cov: 12423 ft: 14436 corp: 6/140b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 ShuffleBytes- 00:08:19.870 [2024-10-27 21:33:21.558285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a29ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.870 [2024-10-27 21:33:21.558310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.140 #47 NEW cov: 12423 ft: 14519 corp: 7/150b lim: 40 exec/s: 0 rss: 73Mb L: 10/37 MS: 1 ChangeBit- 00:08:20.140 [2024-10-27 21:33:21.618293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a29ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.618319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.140 #48 NEW cov: 12423 ft: 14623 corp: 8/165b lim: 40 exec/s: 0 rss: 73Mb L: 15/37 MS: 1 CopyPart- 00:08:20.140 [2024-10-27 21:33:21.678773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0a29 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.678799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.140 [2024-10-27 21:33:21.678873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.678887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.140 [2024-10-27 21:33:21.678948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.678962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.140 [2024-10-27 21:33:21.679020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.679033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.140 #49 NEW cov: 12423 ft: 14693 corp: 9/200b lim: 40 exec/s: 0 rss: 73Mb L: 35/37 MS: 1 CrossOver- 00:08:20.140 [2024-10-27 21:33:21.738346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a29ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.738371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:20.140 #50 NEW cov: 12423 ft: 14796 corp: 10/210b lim: 40 exec/s: 0 rss: 73Mb L: 10/37 MS: 1 ChangeBinInt- 00:08:20.140 [2024-10-27 21:33:21.778764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0a29 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.778789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.140 [2024-10-27 21:33:21.778867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.778882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.140 [2024-10-27 21:33:21.778947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.778960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.140 [2024-10-27 21:33:21.779018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.779032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.140 #51 NEW cov: 12423 ft: 14823 corp: 11/245b lim: 40 exec/s: 0 rss: 73Mb L: 35/37 MS: 1 CopyPart- 00:08:20.140 [2024-10-27 21:33:21.838818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a29ffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.838842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.140 [2024-10-27 21:33:21.838919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.838933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.140 [2024-10-27 21:33:21.838995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.839008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.140 [2024-10-27 21:33:21.839076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fbffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.140 [2024-10-27 21:33:21.839090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.140 #52 NEW cov: 12423 ft: 14843 corp: 12/280b lim: 40 exec/s: 0 rss: 73Mb L: 35/37 MS: 1 CrossOver- 00:08:20.400 [2024-10-27 21:33:21.878794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 
[2024-10-27 21:33:21.878822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:21.878897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:21.878912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:21.878975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:21.878989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:21.879049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:21.879063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.400 #53 NEW cov: 12423 ft: 14878 corp: 13/316b lim: 40 exec/s: 0 rss: 73Mb L: 36/37 MS: 1 InsertRepeatedBytes- 00:08:20.400 [2024-10-27 21:33:21.938840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:21.938865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:21.938926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:21.938940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:21.939018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:21.939033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:21.939091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:21.939104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.400 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:20.400 #54 NEW cov: 12446 ft: 14910 corp: 14/348b lim: 40 exec/s: 0 rss: 73Mb L: 32/37 MS: 1 PersAutoDict- DE: "\377\377"- 00:08:20.400 [2024-10-27 21:33:21.978715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:21.978740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:21.978800] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffeff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:21.978814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:21.978874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:21.978887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.400 #55 NEW cov: 12446 ft: 14939 corp: 15/378b lim: 40 exec/s: 0 rss: 73Mb L: 30/37 MS: 1 ChangeBit- 00:08:20.400 [2024-10-27 21:33:22.018936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:22.018969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:22.019047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:22.019061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:22.019130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:0100ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:22.019143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:22.019201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:22.019214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.400 #56 NEW cov: 12446 ft: 14970 corp: 16/410b lim: 40 exec/s: 56 rss: 73Mb L: 32/37 MS: 1 CMP- DE: "\001\000"- 00:08:20.400 [2024-10-27 21:33:22.058828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:22.058854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:22.058929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff8eff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:22.058951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:22.059008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:22.059022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.400 #57 NEW cov: 12446 ft: 
14979 corp: 17/436b lim: 40 exec/s: 57 rss: 73Mb L: 26/37 MS: 1 InsertByte- 00:08:20.400 [2024-10-27 21:33:22.098639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a3a3a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:22.098665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.400 [2024-10-27 21:33:22.098721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a3a3a3a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.400 [2024-10-27 21:33:22.098734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.400 #60 NEW cov: 12446 ft: 15182 corp: 18/459b lim: 40 exec/s: 60 rss: 73Mb L: 23/37 MS: 3 CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:08:20.660 [2024-10-27 21:33:22.138513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a29ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.138539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.660 #61 NEW cov: 12446 ft: 15233 corp: 19/469b lim: 40 exec/s: 61 rss: 73Mb L: 10/37 MS: 1 ShuffleBytes- 00:08:20.660 [2024-10-27 21:33:22.198560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3a3a3a3a cdw11:3a3a3a3a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.198586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.660 #63 NEW cov: 12446 ft: 15257 corp: 20/479b lim: 40 exec/s: 63 rss: 73Mb L: 10/37 MS: 2 ShuffleBytes-CrossOver- 00:08:20.660 [2024-10-27 21:33:22.238902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0a29 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.238928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.239002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.239016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.239074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.239098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.660 #64 NEW cov: 12446 ft: 15294 corp: 21/505b lim: 40 exec/s: 64 rss: 73Mb L: 26/37 MS: 1 EraseBytes- 00:08:20.660 [2024-10-27 21:33:22.279057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.279083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.660 
[2024-10-27 21:33:22.279142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.279156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.279216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.279229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.279288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.279300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.660 #65 NEW cov: 12446 ft: 15300 corp: 22/539b lim: 40 exec/s: 65 rss: 73Mb L: 34/37 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:20.660 [2024-10-27 21:33:22.339326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0a29 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.339352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.339428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.339442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.339499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.339513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.339572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.339585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.339644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.339657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.660 #66 NEW cov: 12446 ft: 15396 corp: 23/579b lim: 40 exec/s: 66 rss: 73Mb L: 40/40 MS: 1 CopyPart- 00:08:20.660 [2024-10-27 21:33:22.379126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0a29 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.379151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.379227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.379242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.379302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fffffff7 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.379315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.660 [2024-10-27 21:33:22.379375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.660 [2024-10-27 21:33:22.379388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.919 #67 NEW cov: 12446 ft: 15402 corp: 24/614b lim: 40 exec/s: 67 rss: 73Mb L: 35/40 MS: 1 ChangeBit- 00:08:20.919 [2024-10-27 21:33:22.419112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.419139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.919 [2024-10-27 21:33:22.419196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.419209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.919 [2024-10-27 21:33:22.419266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.419279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.919 [2024-10-27 21:33:22.419335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.419348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.919 #68 NEW cov: 12446 ft: 15410 corp: 25/652b lim: 40 exec/s: 68 rss: 73Mb L: 38/40 MS: 1 InsertByte- 00:08:20.919 [2024-10-27 21:33:22.459148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.459174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.919 [2024-10-27 21:33:22.459231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffc3ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.459244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:20.919 [2024-10-27 21:33:22.459318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.459331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.919 [2024-10-27 21:33:22.459386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.459399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.919 #69 NEW cov: 12446 ft: 15420 corp: 26/685b lim: 40 exec/s: 69 rss: 73Mb L: 33/40 MS: 1 InsertByte- 00:08:20.919 [2024-10-27 21:33:22.498728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a29ffff cdw11:08ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.498753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.919 #70 NEW cov: 12446 ft: 15452 corp: 27/695b lim: 40 exec/s: 70 rss: 73Mb L: 10/40 MS: 1 ChangeBinInt- 00:08:20.919 [2024-10-27 21:33:22.539047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0a29 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.539073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.919 [2024-10-27 21:33:22.539149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.919 [2024-10-27 21:33:22.539163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.920 [2024-10-27 21:33:22.539223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.920 [2024-10-27 21:33:22.539236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.920 #71 NEW cov: 12446 ft: 15462 corp: 28/721b lim: 40 exec/s: 71 rss: 73Mb L: 26/40 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\001"- 00:08:20.920 [2024-10-27 21:33:22.598921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.920 [2024-10-27 21:33:22.598952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.920 [2024-10-27 21:33:22.599010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.920 [2024-10-27 21:33:22.599024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.920 #72 NEW cov: 12446 ft: 15480 corp: 29/741b lim: 40 exec/s: 72 rss: 74Mb L: 20/40 MS: 1 EraseBytes- 00:08:21.179 [2024-10-27 21:33:22.659254] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.659279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.659337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff01ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.659350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.659409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.659426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.659485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.659498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.179 #73 NEW cov: 12446 ft: 15515 corp: 30/778b lim: 40 exec/s: 73 rss: 74Mb L: 37/40 MS: 1 ChangeBinInt- 00:08:21.179 [2024-10-27 21:33:22.699474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0a29 cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.699500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.699556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:7fffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.699569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.699624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.699637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.699692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.699704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.699759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.699772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.179 #74 NEW cov: 12446 ft: 15518 corp: 31/818b lim: 40 exec/s: 74 rss: 74Mb L: 40/40 MS: 1 CMP- DE: "\001\000\000\177"- 00:08:21.179 [2024-10-27 
21:33:22.758863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.758888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.179 #75 NEW cov: 12446 ft: 15529 corp: 32/828b lim: 40 exec/s: 75 rss: 74Mb L: 10/40 MS: 1 CrossOver- 00:08:21.179 [2024-10-27 21:33:22.799388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff3a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.799413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.799471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3a3a3a3a cdw11:3a3a3a3a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.799485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.799542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.799555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.799614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.799631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.179 #76 NEW cov: 12446 ft: 15537 corp: 33/862b lim: 40 exec/s: 76 rss: 74Mb L: 34/40 MS: 1 CrossOver- 00:08:21.179 [2024-10-27 21:33:22.858885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a29ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.858911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.179 #77 NEW cov: 12446 ft: 15547 corp: 34/872b lim: 40 exec/s: 77 rss: 74Mb L: 10/40 MS: 1 ShuffleBytes- 00:08:21.179 [2024-10-27 21:33:22.899418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0a29 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.899443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.899505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.899518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.899578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.899591] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.179 [2024-10-27 21:33:22.899650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ff000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.179 [2024-10-27 21:33:22.899663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.438 #78 NEW cov: 12446 ft: 15600 corp: 35/908b lim: 40 exec/s: 78 rss: 74Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:08:21.438 [2024-10-27 21:33:22.959055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.438 [2024-10-27 21:33:22.959080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.438 [2024-10-27 21:33:22.959137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.438 [2024-10-27 21:33:22.959151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.438 #79 NEW cov: 12446 ft: 15616 corp: 36/924b lim: 40 exec/s: 79 rss: 74Mb L: 16/40 MS: 1 EraseBytes- 00:08:21.438 [2024-10-27 21:33:23.019541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0a29 cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.438 [2024-10-27 21:33:23.019566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.438 [2024-10-27 21:33:23.019638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:7fffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.438 [2024-10-27 21:33:23.019652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.439 [2024-10-27 21:33:23.019706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.439 [2024-10-27 21:33:23.019719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.439 [2024-10-27 21:33:23.019778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.439 [2024-10-27 21:33:23.019791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.439 [2024-10-27 21:33:23.019848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.439 [2024-10-27 21:33:23.019861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.439 #80 NEW cov: 12446 ft: 15626 corp: 37/964b lim: 40 exec/s: 40 rss: 74Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:21.439 #80 DONE cov: 12446 ft: 15626 corp: 37/964b lim: 40 exec/s: 40 rss: 74Mb 00:08:21.439 ###### Recommended dictionary. 
######
00:08:21.439 "\377\377" # Uses: 1
00:08:21.439 "\001\000" # Uses: 1
00:08:21.439 "\001\000\000\000\000\000\000\001" # Uses: 0
00:08:21.439 "\001\000\000\177" # Uses: 0
00:08:21.439 ###### End of recommended dictionary. ######
00:08:21.439 Done 80 runs in 2 second(s)
00:08:21.439 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz
00:08:21.439 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:21.439 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:21.439 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1
00:08:21.439 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12
00:08:21.439 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
00:08:21.439 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
00:08:21.439 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412'
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:21.697 21:33:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12
[2024-10-27 21:33:23.206164] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization...
00:08:21.956 [2024-10-27 21:33:23.206231] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3326892 ]
00:08:21.956 [2024-10-27 21:33:23.518403] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation.
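Editor's note: the xtrace above is the whole recipe nvmf/run.sh follows to stand up one fuzzer instance. A minimal bash sketch of the same sequence, reconstructed from the traced commands only (flags and paths are copied verbatim from the trace; the $spdk shorthand and the two output redirections are assumptions, since xtrace does not print redirects):

    # Sketch of the traced nvmf/run.sh steps for fuzzer type 12 (not part of the log).
    spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk  # shorthand for the workspace path
    fuzzer_type=12
    port="44$(printf '%02d' "$fuzzer_type")"   # trace shows: printf %02d 12, then port=4412
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p "$spdk/../corpus/llvm_nvmf_$fuzzer_type"
    # Point the stock TCP target config at this run's port (redirect into nvmf_cfg assumed).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$fuzzer_type.conf"
    # Register the two known-benign leaks with LeakSanitizer (append target assumed).
    echo leak:spdk_nvmf_qpair_disconnect >> /var/tmp/suppress_nvmf_fuzz
    echo leak:nvmf_ctrlr_create >> /var/tmp/suppress_nvmf_fuzz
    export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
    # Launch the harness: -m 0x1 core mask, -s 512 memory in MB (cf. the EAL "-m 512" above),
    # -F target trid, -t 1 the timen from the trace, -D corpus dir, -Z fuzzer id.
    "$spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -P "$spdk/../output/llvm/" -F "$trid" -c "/tmp/fuzz_json_$fuzzer_type.conf" \
        -t 1 -D "$spdk/../corpus/llvm_nvmf_$fuzzer_type" -Z "$fuzzer_type"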
00:08:21.956 [2024-10-27 21:33:23.563598] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.956 [2024-10-27 21:33:23.585935] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.956 [2024-10-27 21:33:23.638441] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:21.956 [2024-10-27 21:33:23.654740] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:21.956 INFO: Running with entropic power schedule (0xFF, 100). 00:08:21.956 INFO: Seed: 2628063264 00:08:22.215 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:22.215 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:22.215 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:22.215 INFO: A corpus is not provided, starting from an empty corpus 00:08:22.215 #2 INITED exec/s: 0 rss: 64Mb 00:08:22.215 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:22.215 This may also happen if the target rejected all inputs we tried so far 00:08:22.215 [2024-10-27 21:33:23.731968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8ad7152c cdw11:d712ef7a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.215 [2024-10-27 21:33:23.732006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.474 NEW_FUNC[1/716]: 0x46f6f8 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:22.474 NEW_FUNC[2/716]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:22.474 #4 NEW cov: 12193 ft: 12192 corp: 2/10b lim: 40 exec/s: 0 rss: 72Mb L: 9/9 MS: 2 ChangeBit-CMP- DE: "\327\025,\327\022\357z\000"- 00:08:22.474 [2024-10-27 21:33:24.081464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8ad7152c cdw11:d7007aef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.474 [2024-10-27 21:33:24.081515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.474 [2024-10-27 21:33:24.081662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:130b9809 cdw11:6e12ef7a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.474 [2024-10-27 21:33:24.081688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.475 #5 NEW cov: 12329 ft: 13574 corp: 3/27b lim: 40 exec/s: 0 rss: 72Mb L: 17/17 MS: 1 CMP- DE: "\000z\357\023\013\230\011n"- 00:08:22.475 [2024-10-27 21:33:24.160920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8ad7152c cdw11:d712af7a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.475 [2024-10-27 21:33:24.160953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.475 #6 NEW cov: 12335 ft: 13936 corp: 4/36b lim: 40 exec/s: 0 rss: 72Mb L: 9/17 MS: 1 ChangeBit- 00:08:22.734 [2024-10-27 21:33:24.211022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a0e8ad7 cdw11:152cd700 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:22.734 [2024-10-27 21:33:24.211050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 #10 NEW cov: 12420 ft: 14278 corp: 5/51b lim: 40 exec/s: 0 rss: 72Mb L: 15/17 MS: 4 ChangeBit-CopyPart-CrossOver-CrossOver- 00:08:22.734 [2024-10-27 21:33:24.260927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a20d1c1 cdw11:2613ef7a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-10-27 21:33:24.260958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 #11 NEW cov: 12420 ft: 14385 corp: 6/60b lim: 40 exec/s: 0 rss: 72Mb L: 9/17 MS: 1 CMP- DE: " \321\301&\023\357z\000"- 00:08:22.734 [2024-10-27 21:33:24.311780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-10-27 21:33:24.311811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 [2024-10-27 21:33:24.311931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-10-27 21:33:24.311953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.734 [2024-10-27 21:33:24.312083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-10-27 21:33:24.312099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.734 [2024-10-27 21:33:24.312225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-10-27 21:33:24.312242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.734 #14 NEW cov: 12420 ft: 14824 corp: 7/96b lim: 40 exec/s: 0 rss: 72Mb L: 36/36 MS: 3 CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:22.734 [2024-10-27 21:33:24.361026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8ad7152c cdw11:d712d77a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-10-27 21:33:24.361055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 #15 NEW cov: 12420 ft: 14908 corp: 8/105b lim: 40 exec/s: 0 rss: 72Mb L: 9/36 MS: 1 CopyPart- 00:08:22.734 [2024-10-27 21:33:24.411042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a8ad715 cdw11:2cd712ef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-10-27 21:33:24.411072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 #16 NEW cov: 12420 ft: 14955 corp: 9/115b lim: 40 exec/s: 0 rss: 72Mb L: 10/36 MS: 1 CrossOver- 00:08:22.993 [2024-10-27 21:33:24.461122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 
cdw10:8ad71524 cdw11:d712d77a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.461151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.993 #17 NEW cov: 12420 ft: 14996 corp: 10/124b lim: 40 exec/s: 0 rss: 72Mb L: 9/36 MS: 1 ChangeByte- 00:08:22.993 [2024-10-27 21:33:24.531959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8a373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.531986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.993 [2024-10-27 21:33:24.532127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.532145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.993 [2024-10-27 21:33:24.532213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.532224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.993 [2024-10-27 21:33:24.532240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:37d7152c cdw11:d712d77a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.532249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.993 #18 NEW cov: 12429 ft: 15122 corp: 11/157b lim: 40 exec/s: 0 rss: 72Mb L: 33/36 MS: 1 InsertRepeatedBytes- 00:08:22.993 [2024-10-27 21:33:24.582087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8a373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.582114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.993 [2024-10-27 21:33:24.582238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.582254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.993 [2024-10-27 21:33:24.582375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:37373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.582394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.993 [2024-10-27 21:33:24.582517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:37d7262c cdw11:d712d77a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.582533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.993 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:22.993 #24 NEW cov: 12452 ft: 15165 corp: 12/190b lim: 40 exec/s: 0 rss: 72Mb L: 33/36 MS: 1 ChangeByte- 00:08:22.993 [2024-10-27 21:33:24.652136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8a373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.652162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.993 [2024-10-27 21:33:24.652294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.652310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.993 [2024-10-27 21:33:24.652424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:37373700 cdw11:7aef130b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.652442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.993 [2024-10-27 21:33:24.652568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:98096e2c cdw11:d712d77a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.652584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.993 #25 NEW cov: 12452 ft: 15198 corp: 13/223b lim: 40 exec/s: 0 rss: 73Mb L: 33/36 MS: 1 PersAutoDict- DE: "\000z\357\023\013\230\011n"- 00:08:22.993 [2024-10-27 21:33:24.701286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8ad7d712 cdw11:ef7a5dd7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-10-27 21:33:24.701313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.252 #28 NEW cov: 12452 ft: 15214 corp: 14/238b lim: 40 exec/s: 28 rss: 73Mb L: 15/36 MS: 3 EraseBytes-ChangeByte-PersAutoDict- DE: "\327\025,\327\022\357z\000"- 00:08:23.252 [2024-10-27 21:33:24.751595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8ad7152c cdw11:d7007aef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.252 [2024-10-27 21:33:24.751622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.252 [2024-10-27 21:33:24.751745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:130b9800 cdw11:096e12ef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.252 [2024-10-27 21:33:24.751763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.252 #29 NEW cov: 12452 ft: 15254 corp: 15/256b lim: 40 exec/s: 29 rss: 73Mb L: 18/36 MS: 1 InsertByte- 00:08:23.252 [2024-10-27 21:33:24.822271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8a373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.252 [2024-10-27 21:33:24.822298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.252 [2024-10-27 21:33:24.822428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.252 [2024-10-27 21:33:24.822444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.252 [2024-10-27 21:33:24.822567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:3737370a cdw11:8ad7152c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.252 [2024-10-27 21:33:24.822585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.252 [2024-10-27 21:33:24.822712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:d712ef7a cdw11:0012d77a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.252 [2024-10-27 21:33:24.822728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.253 #30 NEW cov: 12452 ft: 15293 corp: 16/289b lim: 40 exec/s: 30 rss: 73Mb L: 33/36 MS: 1 CrossOver- 00:08:23.253 [2024-10-27 21:33:24.892630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-10-27 21:33:24.892659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.253 [2024-10-27 21:33:24.892779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-10-27 21:33:24.892797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.253 [2024-10-27 21:33:24.892927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-10-27 21:33:24.892950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.253 [2024-10-27 21:33:24.893077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-10-27 21:33:24.893094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.253 [2024-10-27 21:33:24.893225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:ffffd7fd cdw11:fdfdfd15 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-10-27 21:33:24.893241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.253 #31 NEW cov: 12452 ft: 15367 corp: 17/329b lim: 40 exec/s: 31 rss: 73Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:23.253 [2024-10-27 21:33:24.962384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-10-27 21:33:24.962413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.253 [2024-10-27 21:33:24.962549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-10-27 21:33:24.962567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.253 [2024-10-27 21:33:24.962695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-10-27 21:33:24.962713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.253 [2024-10-27 21:33:24.962843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-10-27 21:33:24.962860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.512 #32 NEW cov: 12452 ft: 15369 corp: 18/365b lim: 40 exec/s: 32 rss: 73Mb L: 36/40 MS: 1 ChangeBinInt- 00:08:23.512 [2024-10-27 21:33:25.012390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.012422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.512 [2024-10-27 21:33:25.012547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dfffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.012566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.512 [2024-10-27 21:33:25.012702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.012721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.512 [2024-10-27 21:33:25.012841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.012859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.512 #33 NEW cov: 12452 ft: 15408 corp: 19/401b lim: 40 exec/s: 33 rss: 73Mb L: 36/40 MS: 1 ChangeBit- 00:08:23.512 [2024-10-27 21:33:25.062333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8a373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.062361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.512 [2024-10-27 21:33:25.062494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.062512] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.512 [2024-10-27 21:33:25.062644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:24373700 cdw11:7aef130b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.062661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.512 [2024-10-27 21:33:25.062797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:98096e2c cdw11:d712d77a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.062814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.512 #34 NEW cov: 12452 ft: 15440 corp: 20/434b lim: 40 exec/s: 34 rss: 73Mb L: 33/40 MS: 1 ChangeByte- 00:08:23.512 [2024-10-27 21:33:25.111493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8ad7152c cdw11:d712ef40 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.111520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.512 #35 NEW cov: 12452 ft: 15460 corp: 21/444b lim: 40 exec/s: 35 rss: 73Mb L: 10/40 MS: 1 InsertByte- 00:08:23.512 [2024-10-27 21:33:25.161929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a8ad715 cdw11:2cd712ef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.161965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.512 [2024-10-27 21:33:25.162088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:7ad7152c cdw11:d712ef7a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.162104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.512 #36 NEW cov: 12452 ft: 15478 corp: 22/462b lim: 40 exec/s: 36 rss: 73Mb L: 18/40 MS: 1 CopyPart- 00:08:23.512 [2024-10-27 21:33:25.231960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a8ad715 cdw11:2cd712ef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.231991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.512 [2024-10-27 21:33:25.232133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:7ad7152c cdw11:d712ef7a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-10-27 21:33:25.232151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.771 #37 NEW cov: 12452 ft: 15498 corp: 23/480b lim: 40 exec/s: 37 rss: 73Mb L: 18/40 MS: 1 CopyPart- 00:08:23.771 [2024-10-27 21:33:25.301968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a0e8ad7 cdw11:152cd700 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.771 [2024-10-27 21:33:25.301998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.771 [2024-10-27 
21:33:25.302123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:7aef007a cdw11:ef130b98 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.771 [2024-10-27 21:33:25.302142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.772 #38 NEW cov: 12452 ft: 15504 corp: 24/503b lim: 40 exec/s: 38 rss: 73Mb L: 23/40 MS: 1 PersAutoDict- DE: "\000z\357\023\013\230\011n"- 00:08:23.772 [2024-10-27 21:33:25.372570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8a373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-10-27 21:33:25.372600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.772 [2024-10-27 21:33:25.372741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:37373737 cdw11:37373737 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-10-27 21:33:25.372759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.772 [2024-10-27 21:33:25.372886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:37373700 cdw11:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-10-27 21:33:25.372905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.772 [2024-10-27 21:33:25.373027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ef130b98 cdw11:096e2cd7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-10-27 21:33:25.373048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.772 #39 NEW cov: 12452 ft: 15507 corp: 25/539b lim: 40 exec/s: 39 rss: 73Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:08:23.772 [2024-10-27 21:33:25.422630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-10-27 21:33:25.422659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.772 [2024-10-27 21:33:25.422797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dfffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-10-27 21:33:25.422814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.772 [2024-10-27 21:33:25.422946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-10-27 21:33:25.422963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.772 [2024-10-27 21:33:25.423091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-10-27 21:33:25.423109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.772 #40 NEW cov: 12452 ft: 15512 corp: 26/575b lim: 40 exec/s: 40 rss: 73Mb L: 36/40 MS: 1 ShuffleBytes- 00:08:23.772 [2024-10-27 21:33:25.491791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-10-27 21:33:25.491819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.031 #41 NEW cov: 12452 ft: 15563 corp: 27/584b lim: 40 exec/s: 41 rss: 73Mb L: 9/40 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:24.031 [2024-10-27 21:33:25.541869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8ad7d712 cdw11:ef7a5dd7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.031 [2024-10-27 21:33:25.541897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.031 #42 NEW cov: 12452 ft: 15611 corp: 28/599b lim: 40 exec/s: 42 rss: 73Mb L: 15/40 MS: 1 ChangeByte- 00:08:24.031 [2024-10-27 21:33:25.611923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8ad7d712 cdw11:5b7a5dd7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.031 [2024-10-27 21:33:25.611955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.031 #43 NEW cov: 12452 ft: 15633 corp: 29/614b lim: 40 exec/s: 43 rss: 73Mb L: 15/40 MS: 1 ChangeByte- 00:08:24.031 [2024-10-27 21:33:25.681948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a0e8ad7 cdw11:152cd700 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.031 [2024-10-27 21:33:25.681977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.031 #44 NEW cov: 12452 ft: 15669 corp: 30/629b lim: 40 exec/s: 22 rss: 73Mb L: 15/40 MS: 1 ChangeBinInt- 00:08:24.031 #44 DONE cov: 12452 ft: 15669 corp: 30/629b lim: 40 exec/s: 22 rss: 73Mb 00:08:24.031 ###### Recommended dictionary. ###### 00:08:24.031 "\327\025,\327\022\357z\000" # Uses: 1 00:08:24.031 "\000z\357\023\013\230\011n" # Uses: 2 00:08:24.031 " \321\301&\023\357z\000" # Uses: 0 00:08:24.031 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:24.031 ###### End of recommended dictionary. 
######
00:08:24.031 Done 44 runs in 2 second(s)
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413'
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:24.290 21:33:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13
[2024-10-27 21:33:25.853136] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization...
00:08:24.549 [2024-10-27 21:33:25.853224] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3327421 ]
00:08:24.549 [2024-10-27 21:33:26.181617] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation.
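Editor's note: the "Recommended dictionary" blocks that close each run list the byte strings the fuzzer found worth keeping, printed as libFuzzer-style octal escapes. printf can decode an entry back into raw bytes; a one-line sketch using the first run-12 entry above (xxd is used only for display):

    # Decode the "\327\025,\327\022\357z\000" dictionary entry into raw bytes.
    printf '\327\025,\327\022\357z\000' | xxd
    # Expected output (spacing approximate):
    # 00000000: d715 2cd7 12ef 7a00                      ..,...z.

Those eight bytes are exactly the DE: "\327\025,\327\022\357z\000" pattern the CMP mutations in the run above report persisting into the corpus.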
00:08:24.549 [2024-10-27 21:33:26.229827] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.549 [2024-10-27 21:33:26.248740] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.809 [2024-10-27 21:33:26.301627] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:24.809 [2024-10-27 21:33:26.317955] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:24.809 INFO: Running with entropic power schedule (0xFF, 100). 00:08:24.809 INFO: Seed: 996086473 00:08:24.809 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:24.809 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:24.809 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:24.809 INFO: A corpus is not provided, starting from an empty corpus 00:08:24.809 #2 INITED exec/s: 0 rss: 65Mb 00:08:24.809 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:24.809 This may also happen if the target rejected all inputs we tried so far 00:08:24.809 [2024-10-27 21:33:26.373657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.809 [2024-10-27 21:33:26.373685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.809 [2024-10-27 21:33:26.373751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.809 [2024-10-27 21:33:26.373765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.809 [2024-10-27 21:33:26.373823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.809 [2024-10-27 21:33:26.373837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.809 [2024-10-27 21:33:26.373898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.809 [2024-10-27 21:33:26.373911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.069 NEW_FUNC[1/715]: 0x4712c8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:25.069 NEW_FUNC[2/715]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:25.069 #5 NEW cov: 12205 ft: 12199 corp: 2/38b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:25.069 [2024-10-27 21:33:26.693327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fe8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.069 [2024-10-27 21:33:26.693359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.069 [2024-10-27 21:33:26.693414] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.069 [2024-10-27 21:33:26.693428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.069 #14 NEW cov: 12318 ft: 13387 corp: 3/61b lim: 40 exec/s: 0 rss: 72Mb L: 23/37 MS: 4 ChangeBinInt-CrossOver-ChangeBinInt-InsertRepeatedBytes- 00:08:25.069 [2024-10-27 21:33:26.733490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.069 [2024-10-27 21:33:26.733516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.069 [2024-10-27 21:33:26.733572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.069 [2024-10-27 21:33:26.733586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.069 [2024-10-27 21:33:26.733641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.069 [2024-10-27 21:33:26.733655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.069 [2024-10-27 21:33:26.733709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.069 [2024-10-27 21:33:26.733723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.069 #15 NEW cov: 12324 ft: 13514 corp: 4/98b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 ShuffleBytes- 00:08:25.069 [2024-10-27 21:33:26.793288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.069 [2024-10-27 21:33:26.793314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.069 [2024-10-27 21:33:26.793374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.069 [2024-10-27 21:33:26.793388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.329 #16 NEW cov: 12409 ft: 13805 corp: 5/121b lim: 40 exec/s: 0 rss: 72Mb L: 23/37 MS: 1 CrossOver- 00:08:25.329 [2024-10-27 21:33:26.833491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:26.833516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.329 [2024-10-27 21:33:26.833572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:26.833586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.329 [2024-10-27 21:33:26.833640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00250000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:26.833654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.329 [2024-10-27 21:33:26.833709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:26.833721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.329 #17 NEW cov: 12409 ft: 13922 corp: 6/158b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 ChangeBinInt- 00:08:25.329 [2024-10-27 21:33:26.893457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:26.893482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.329 [2024-10-27 21:33:26.893555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fe8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:26.893569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.329 [2024-10-27 21:33:26.893626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:26.893640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.329 #18 NEW cov: 12409 ft: 14129 corp: 7/184b lim: 40 exec/s: 0 rss: 73Mb L: 26/37 MS: 1 CrossOver- 00:08:25.329 [2024-10-27 21:33:26.953341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:26.953365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.329 [2024-10-27 21:33:26.953437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:26.953451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.329 #19 NEW cov: 12409 ft: 14182 corp: 8/207b lim: 40 exec/s: 0 rss: 73Mb L: 23/37 MS: 1 CopyPart- 00:08:25.329 [2024-10-27 21:33:27.013633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:27.013661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.329 [2024-10-27 
21:33:27.013735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:27.013748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.329 [2024-10-27 21:33:27.013805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:27.013818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.329 [2024-10-27 21:33:27.013874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:27.013887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.329 #24 NEW cov: 12409 ft: 14304 corp: 9/242b lim: 40 exec/s: 0 rss: 73Mb L: 35/37 MS: 5 CopyPart-ChangeByte-ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:08:25.329 [2024-10-27 21:33:27.053272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.329 [2024-10-27 21:33:27.053297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.589 #25 NEW cov: 12409 ft: 14606 corp: 10/254b lim: 40 exec/s: 0 rss: 73Mb L: 12/37 MS: 1 CrossOver- 00:08:25.589 [2024-10-27 21:33:27.093638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2e2e2e23 cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.093663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.589 [2024-10-27 21:33:27.093719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.093732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.589 [2024-10-27 21:33:27.093788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.093801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.589 [2024-10-27 21:33:27.093872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.093886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.589 #26 NEW cov: 12409 ft: 14646 corp: 11/290b lim: 40 exec/s: 0 rss: 73Mb L: 36/37 MS: 1 InsertByte- 00:08:25.589 [2024-10-27 21:33:27.153398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fe8b8b8b cdw11:8b8b8b8b SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.153423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.589 [2024-10-27 21:33:27.153497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.153511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.589 #27 NEW cov: 12409 ft: 14685 corp: 12/313b lim: 40 exec/s: 0 rss: 73Mb L: 23/37 MS: 1 ChangeBinInt- 00:08:25.589 [2024-10-27 21:33:27.213634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.213660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.589 [2024-10-27 21:33:27.213731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.213745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.589 [2024-10-27 21:33:27.213800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.213813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.589 [2024-10-27 21:33:27.213869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.213882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.589 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:25.589 #28 NEW cov: 12432 ft: 14755 corp: 13/350b lim: 40 exec/s: 0 rss: 73Mb L: 37/37 MS: 1 CopyPart- 00:08:25.589 [2024-10-27 21:33:27.273610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.273635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.589 [2024-10-27 21:33:27.273710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.273724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.589 [2024-10-27 21:33:27.273782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.273795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:08:25.589 #29 NEW cov: 12432 ft: 14789 corp: 14/375b lim: 40 exec/s: 0 rss: 73Mb L: 25/37 MS: 1 EraseBytes- 00:08:25.589 [2024-10-27 21:33:27.313484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fe8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.313509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.589 [2024-10-27 21:33:27.313584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.589 [2024-10-27 21:33:27.313598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.849 #35 NEW cov: 12432 ft: 14847 corp: 15/395b lim: 40 exec/s: 35 rss: 73Mb L: 20/37 MS: 1 EraseBytes- 00:08:25.849 [2024-10-27 21:33:27.373469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.849 [2024-10-27 21:33:27.373494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.849 [2024-10-27 21:33:27.373569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:02000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.373584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.850 #36 NEW cov: 12432 ft: 14854 corp: 16/418b lim: 40 exec/s: 36 rss: 73Mb L: 23/37 MS: 1 CMP- DE: "\000\000\002\000"- 00:08:25.850 [2024-10-27 21:33:27.413748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.413773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.850 [2024-10-27 21:33:27.413833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.413846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.850 [2024-10-27 21:33:27.413905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.413918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.850 [2024-10-27 21:33:27.413992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00008000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.414006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.850 #37 NEW cov: 12432 ft: 14896 corp: 17/455b lim: 40 exec/s: 37 rss: 73Mb L: 37/37 MS: 1 ChangeBit- 00:08:25.850 [2024-10-27 21:33:27.453765] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:31000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.453790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.850 [2024-10-27 21:33:27.453862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.453876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.850 [2024-10-27 21:33:27.453935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.453954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.850 [2024-10-27 21:33:27.454010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.454023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.850 #38 NEW cov: 12432 ft: 14907 corp: 18/493b lim: 40 exec/s: 38 rss: 73Mb L: 38/38 MS: 1 InsertByte- 00:08:25.850 [2024-10-27 21:33:27.513810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.513834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.850 [2024-10-27 21:33:27.513908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.513923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.850 [2024-10-27 21:33:27.513982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.513996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.850 [2024-10-27 21:33:27.514052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00008000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.514065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.850 #39 NEW cov: 12432 ft: 14926 corp: 19/531b lim: 40 exec/s: 39 rss: 73Mb L: 38/38 MS: 1 InsertByte- 00:08:25.850 [2024-10-27 21:33:27.573596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.573621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:25.850 [2024-10-27 21:33:27.573676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.850 [2024-10-27 21:33:27.573690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.110 #40 NEW cov: 12432 ft: 14945 corp: 20/554b lim: 40 exec/s: 40 rss: 74Mb L: 23/38 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:08:26.110 [2024-10-27 21:33:27.613831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.613856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.110 [2024-10-27 21:33:27.613914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.613927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.110 [2024-10-27 21:33:27.614000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00250000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.614014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.110 [2024-10-27 21:33:27.614069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.614082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.110 #41 NEW cov: 12432 ft: 14952 corp: 21/591b lim: 40 exec/s: 41 rss: 74Mb L: 37/38 MS: 1 ChangeBinInt- 00:08:26.110 [2024-10-27 21:33:27.653857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2e2e2e23 cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.653882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.110 [2024-10-27 21:33:27.653958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.653972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.110 [2024-10-27 21:33:27.654040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.654056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.110 [2024-10-27 21:33:27.654113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.654126] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.110 #47 NEW cov: 12432 ft: 15012 corp: 22/627b lim: 40 exec/s: 47 rss: 74Mb L: 36/38 MS: 1 ShuffleBytes- 00:08:26.110 [2024-10-27 21:33:27.713530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:8b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.713554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.110 #48 NEW cov: 12432 ft: 15027 corp: 23/639b lim: 40 exec/s: 48 rss: 74Mb L: 12/38 MS: 1 ChangeByte- 00:08:26.110 [2024-10-27 21:33:27.773675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fe8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.773700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.110 [2024-10-27 21:33:27.773770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00008b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.773783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.110 #49 NEW cov: 12432 ft: 15028 corp: 24/662b lim: 40 exec/s: 49 rss: 74Mb L: 23/38 MS: 1 CrossOver- 00:08:26.110 [2024-10-27 21:33:27.813679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.813702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.110 [2024-10-27 21:33:27.813773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.110 [2024-10-27 21:33:27.813787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.370 #50 NEW cov: 12432 ft: 15054 corp: 25/685b lim: 40 exec/s: 50 rss: 74Mb L: 23/38 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:08:26.370 [2024-10-27 21:33:27.873711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fe000002 cdw11:008b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.370 [2024-10-27 21:33:27.873736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.370 [2024-10-27 21:33:27.873791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.370 [2024-10-27 21:33:27.873805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.370 #51 NEW cov: 12432 ft: 15125 corp: 26/708b lim: 40 exec/s: 51 rss: 74Mb L: 23/38 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:08:26.370 [2024-10-27 21:33:27.914081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2e2e2e23 cdw11:2e2e2e2e SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:26.370 [2024-10-27 21:33:27.914105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.370 [2024-10-27 21:33:27.914166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.370 [2024-10-27 21:33:27.914183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.370 [2024-10-27 21:33:27.914241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.370 [2024-10-27 21:33:27.914254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.370 [2024-10-27 21:33:27.914310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.371 [2024-10-27 21:33:27.914323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.371 [2024-10-27 21:33:27.914378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e53 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.371 [2024-10-27 21:33:27.914390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:26.371 #52 NEW cov: 12432 ft: 15170 corp: 27/748b lim: 40 exec/s: 52 rss: 74Mb L: 40/40 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:08:26.371 [2024-10-27 21:33:27.954100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2e2e2e23 cdw11:2e2e2e2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.371 [2024-10-27 21:33:27.954125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.371 [2024-10-27 21:33:27.954196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.371 [2024-10-27 21:33:27.954210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.371 [2024-10-27 21:33:27.954267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:2e008b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.371 [2024-10-27 21:33:27.954280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.371 [2024-10-27 21:33:27.954337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.371 [2024-10-27 21:33:27.954350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.371 [2024-10-27 21:33:27.954408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:8b8b8b2e cdw11:2e2e2e53 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:26.371 [2024-10-27 21:33:27.954421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:26.371 #53 NEW cov: 12432 ft: 15184 corp: 28/788b lim: 40 exec/s: 53 rss: 74Mb L: 40/40 MS: 1 CrossOver- 00:08:26.371 [2024-10-27 21:33:28.013768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.371 [2024-10-27 21:33:28.013792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.371 [2024-10-27 21:33:28.013865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.371 [2024-10-27 21:33:28.013879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.371 #54 NEW cov: 12432 ft: 15190 corp: 29/805b lim: 40 exec/s: 54 rss: 74Mb L: 17/40 MS: 1 EraseBytes- 00:08:26.371 [2024-10-27 21:33:28.073853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fe8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.371 [2024-10-27 21:33:28.073880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.371 [2024-10-27 21:33:28.073961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.371 [2024-10-27 21:33:28.073976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.629 #55 NEW cov: 12432 ft: 15210 corp: 30/827b lim: 40 exec/s: 55 rss: 74Mb L: 22/40 MS: 1 CMP- DE: "\001\000"- 00:08:26.629 [2024-10-27 21:33:28.134072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.134097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.629 [2024-10-27 21:33:28.134168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.134181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.629 [2024-10-27 21:33:28.134237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.134250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.629 [2024-10-27 21:33:28.134306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.134318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.629 #56 NEW cov: 
12432 ft: 15223 corp: 31/865b lim: 40 exec/s: 56 rss: 74Mb L: 38/40 MS: 1 InsertByte- 00:08:26.629 [2024-10-27 21:33:28.173864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fe8b8b8b cdw11:8b948b01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.173889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.629 [2024-10-27 21:33:28.173965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:008b8b8b cdw11:8b8b8b94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.173978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.629 #57 NEW cov: 12432 ft: 15232 corp: 32/887b lim: 40 exec/s: 57 rss: 74Mb L: 22/40 MS: 1 CopyPart- 00:08:26.629 [2024-10-27 21:33:28.234066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.234092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.629 [2024-10-27 21:33:28.234150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.234163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.629 [2024-10-27 21:33:28.234219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.234232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.629 #58 NEW cov: 12432 ft: 15270 corp: 33/911b lim: 40 exec/s: 58 rss: 74Mb L: 24/40 MS: 1 InsertByte- 00:08:26.629 [2024-10-27 21:33:28.274047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fe000002 cdw11:008b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.274073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.629 [2024-10-27 21:33:28.274128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.274141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.629 [2024-10-27 21:33:28.274199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.274213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.629 #59 NEW cov: 12432 ft: 15272 corp: 34/935b lim: 40 exec/s: 59 rss: 74Mb L: 24/40 MS: 1 InsertByte- 00:08:26.629 [2024-10-27 21:33:28.334104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fe000002 cdw11:008b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.334129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.629 [2024-10-27 21:33:28.334187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.334201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.629 [2024-10-27 21:33:28.334271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.629 [2024-10-27 21:33:28.334285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.888 #60 NEW cov: 12432 ft: 15284 corp: 35/959b lim: 40 exec/s: 30 rss: 74Mb L: 24/40 MS: 1 CopyPart- 00:08:26.888 #60 DONE cov: 12432 ft: 15284 corp: 35/959b lim: 40 exec/s: 30 rss: 74Mb 00:08:26.888 ###### Recommended dictionary. ###### 00:08:26.888 "\000\000\002\000" # Uses: 4 00:08:26.888 "\001\000" # Uses: 0 00:08:26.888 ###### End of recommended dictionary. ###### 00:08:26.888 Done 60 runs in 2 second(s) 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 
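The traced run.sh lines above set up fuzzer 14 before it launches below: a per-target JSON config (/tmp/fuzz_json_14.conf) is derived from the shared template by rewriting the TCP listener port from 4420 to 4414, and two known-benign allocation sites are whitelisted so LeakSanitizer does not abort the run. A minimal sketch of what those traced steps amount to, in the same shell style as run.sh; the redirection targets are not visible in the trace, so treat them as assumptions:

  # Rewrite the listener port in the shared template config (template path as shown in the trace)
  sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' \
      /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf \
      > /tmp/fuzz_json_14.conf                                          # assumed redirection
  # Build the LeakSanitizer suppression file referenced by LSAN_OPTIONS
  echo leak:spdk_nvmf_qpair_disconnect  > /var/tmp/suppress_nvmf_fuzz   # assumed redirection
  echo leak:nvmf_ctrlr_create          >> /var/tmp/suppress_nvmf_fuzz   # assumed redirection
  export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0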
00:08:26.888 21:33:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:26.888 [2024-10-27 21:33:28.509826] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:26.888 [2024-10-27 21:33:28.509881] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3327833 ] 00:08:27.147 [2024-10-27 21:33:28.749692] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:27.147 [2024-10-27 21:33:28.795398] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.147 [2024-10-27 21:33:28.808605] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.147 [2024-10-27 21:33:28.860950] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:27.406 [2024-10-27 21:33:28.877276] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:27.406 INFO: Running with entropic power schedule (0xFF, 100). 00:08:27.406 INFO: Seed: 3556089344 00:08:27.406 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:27.406 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:27.406 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:27.406 INFO: A corpus is not provided, starting from an empty corpus 00:08:27.406 #2 INITED exec/s: 0 rss: 67Mb 00:08:27.406 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
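The #N status lines that follow are standard libFuzzer progress output: cov counts covered code edges, ft distinct coverage features, corp gives corpus entries/total bytes, lim the current input-length cap, exec/s the execution rate, rss resident memory, L the new input's length against the largest in the corpus, and MS the mutation sequence that produced it (ChangeBit, CrossOver, PersAutoDict for a persistent-dictionary hit, and so on). A quick way to pull the closing statistics out of a saved copy of such a log; the file name fuzz_14.log is hypothetical, and the pattern relies only on the '#N NEW/DONE cov:' prefix visible in the lines here:

  # Print the last coverage status line recorded by the fuzzer
  grep -oE '#[0-9]+ (NEW|DONE) cov:.*' fuzz_14.log | tail -n 1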
00:08:27.406 This may also happen if the target rejected all inputs we tried so far 00:08:27.406 [2024-10-27 21:33:28.953549] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.406 [2024-10-27 21:33:28.953584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.406 [2024-10-27 21:33:28.953719] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.406 [2024-10-27 21:33:28.953735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.666 NEW_FUNC[1/716]: 0x472e98 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:27.666 NEW_FUNC[2/716]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:27.666 #14 NEW cov: 12199 ft: 12197 corp: 2/15b lim: 35 exec/s: 0 rss: 73Mb L: 14/14 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:27.666 [2024-10-27 21:33:29.283673] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.666 [2024-10-27 21:33:29.283713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.666 [2024-10-27 21:33:29.283835] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.666 [2024-10-27 21:33:29.283854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.666 #15 NEW cov: 12312 ft: 12896 corp: 3/29b lim: 35 exec/s: 0 rss: 73Mb L: 14/14 MS: 1 ChangeByte- 00:08:27.666 [2024-10-27 21:33:29.343655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES TIMESTAMP cid:4 cdw10:0000000e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.666 [2024-10-27 21:33:29.343683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.666 [2024-10-27 21:33:29.343811] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.666 [2024-10-27 21:33:29.343828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.666 #18 NEW cov: 12318 ft: 13134 corp: 4/48b lim: 35 exec/s: 0 rss: 73Mb L: 19/19 MS: 3 ChangeBinInt-CopyPart-InsertRepeatedBytes- 00:08:27.666 [2024-10-27 21:33:29.383808] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.666 [2024-10-27 21:33:29.383836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.925 NEW_FUNC[1/2]: 0x4943e8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:27.925 NEW_FUNC[2/2]: 0x1397e48 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1766 00:08:27.925 #20 NEW 
cov: 12436 ft: 13626 corp: 5/68b lim: 35 exec/s: 0 rss: 73Mb L: 20/20 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:27.925 [2024-10-27 21:33:29.423435] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000077 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-10-27 21:33:29.423461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.925 #25 NEW cov: 12436 ft: 14265 corp: 6/81b lim: 35 exec/s: 0 rss: 73Mb L: 13/20 MS: 5 CrossOver-CrossOver-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:27.925 [2024-10-27 21:33:29.493719] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-10-27 21:33:29.493747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.925 [2024-10-27 21:33:29.493872] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-10-27 21:33:29.493888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.925 #26 NEW cov: 12436 ft: 14344 corp: 7/95b lim: 35 exec/s: 0 rss: 74Mb L: 14/20 MS: 1 ChangeBinInt- 00:08:27.925 [2024-10-27 21:33:29.564425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-10-27 21:33:29.564453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.925 [2024-10-27 21:33:29.564580] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-10-27 21:33:29.564598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.925 [2024-10-27 21:33:29.564725] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-10-27 21:33:29.564742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.925 #27 NEW cov: 12436 ft: 14667 corp: 8/129b lim: 35 exec/s: 0 rss: 74Mb L: 34/34 MS: 1 CopyPart- 00:08:27.925 [2024-10-27 21:33:29.634703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-10-27 21:33:29.634738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.925 [2024-10-27 21:33:29.634862] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-10-27 21:33:29.634884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.925 [2024-10-27 21:33:29.635011] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.926 [2024-10-27 21:33:29.635028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.926 [2024-10-27 21:33:29.635153] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.926 [2024-10-27 21:33:29.635170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:28.184 #28 NEW cov: 12443 ft: 14977 corp: 9/164b lim: 35 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:28.184 [2024-10-27 21:33:29.683850] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-10-27 21:33:29.683878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.184 [2024-10-27 21:33:29.684003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-10-27 21:33:29.684026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.184 #29 NEW cov: 12443 ft: 15028 corp: 10/183b lim: 35 exec/s: 0 rss: 74Mb L: 19/35 MS: 1 InsertRepeatedBytes- 00:08:28.184 [2024-10-27 21:33:29.723903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-10-27 21:33:29.723930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.184 #30 NEW cov: 12443 ft: 15068 corp: 11/203b lim: 35 exec/s: 0 rss: 74Mb L: 20/35 MS: 1 ChangeBit- 00:08:28.184 [2024-10-27 21:33:29.774606] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-10-27 21:33:29.774638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.184 [2024-10-27 21:33:29.774764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-10-27 21:33:29.774783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.184 [2024-10-27 21:33:29.774899] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-10-27 21:33:29.774916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.184 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:28.184 #31 NEW cov: 12466 ft: 15098 corp: 12/237b lim: 35 exec/s: 0 rss: 74Mb L: 34/35 MS: 1 EraseBytes- 00:08:28.184 [2024-10-27 21:33:29.843955] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-10-27 21:33:29.843986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.184 [2024-10-27 21:33:29.844115] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES 
RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-10-27 21:33:29.844147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.184 #32 NEW cov: 12466 ft: 15127 corp: 13/251b lim: 35 exec/s: 0 rss: 74Mb L: 14/35 MS: 1 CrossOver- 00:08:28.184 [2024-10-27 21:33:29.894049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-10-27 21:33:29.894078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.184 [2024-10-27 21:33:29.894205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-10-27 21:33:29.894223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.443 #33 NEW cov: 12466 ft: 15206 corp: 14/265b lim: 35 exec/s: 0 rss: 74Mb L: 14/35 MS: 1 ChangeByte- 00:08:28.443 [2024-10-27 21:33:29.934026] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:29.934054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.443 #34 NEW cov: 12466 ft: 15271 corp: 15/285b lim: 35 exec/s: 34 rss: 74Mb L: 20/35 MS: 1 ChangeBit- 00:08:28.443 [2024-10-27 21:33:29.984017] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:29.984044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.443 [2024-10-27 21:33:29.984172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:29.984190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.443 #35 NEW cov: 12466 ft: 15324 corp: 16/300b lim: 35 exec/s: 35 rss: 74Mb L: 15/35 MS: 1 InsertByte- 00:08:28.443 [2024-10-27 21:33:30.054801] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:30.054838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.443 [2024-10-27 21:33:30.054960] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:30.054979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.443 [2024-10-27 21:33:30.055109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:30.055126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.443 #36 NEW cov: 12466 ft: 15392 corp: 17/334b lim: 35 exec/s: 36 rss: 74Mb L: 34/35 MS: 1 
ChangeBinInt- 00:08:28.443 [2024-10-27 21:33:30.114973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:30.115009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.443 [2024-10-27 21:33:30.115132] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:30.115154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.443 [2024-10-27 21:33:30.115280] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:30.115302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.443 [2024-10-27 21:33:30.115448] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:30.115469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:28.443 #37 NEW cov: 12466 ft: 15471 corp: 18/369b lim: 35 exec/s: 37 rss: 74Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:28.443 [2024-10-27 21:33:30.154241] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:30.154275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.443 [2024-10-27 21:33:30.154398] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:30.154417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.443 [2024-10-27 21:33:30.154547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.443 [2024-10-27 21:33:30.154564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.702 #38 NEW cov: 12466 ft: 15604 corp: 19/390b lim: 35 exec/s: 38 rss: 74Mb L: 21/35 MS: 1 InsertRepeatedBytes- 00:08:28.702 [2024-10-27 21:33:30.224220] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.702 [2024-10-27 21:33:30.224249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.702 [2024-10-27 21:33:30.224371] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.702 [2024-10-27 21:33:30.224389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.702 #39 NEW cov: 12466 ft: 15646 corp: 20/404b lim: 35 exec/s: 39 rss: 74Mb L: 14/35 MS: 1 ChangeBinInt- 00:08:28.702 [2024-10-27 21:33:30.284253] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.702 [2024-10-27 21:33:30.284282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.702 [2024-10-27 21:33:30.284409] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.702 [2024-10-27 21:33:30.284432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.702 #40 NEW cov: 12466 ft: 15667 corp: 21/423b lim: 35 exec/s: 40 rss: 74Mb L: 19/35 MS: 1 ChangeBit- 00:08:28.702 [2024-10-27 21:33:30.354938] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.702 [2024-10-27 21:33:30.354969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.702 [2024-10-27 21:33:30.355085] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.702 [2024-10-27 21:33:30.355103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.702 [2024-10-27 21:33:30.355221] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.702 [2024-10-27 21:33:30.355239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.702 #41 NEW cov: 12466 ft: 15694 corp: 22/457b lim: 35 exec/s: 41 rss: 74Mb L: 34/35 MS: 1 ChangeByte- 00:08:28.702 [2024-10-27 21:33:30.413972] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000077 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.702 [2024-10-27 21:33:30.413999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.962 #42 NEW cov: 12466 ft: 15704 corp: 23/470b lim: 35 exec/s: 42 rss: 75Mb L: 13/35 MS: 1 ChangeBit- 00:08:28.962 [2024-10-27 21:33:30.483992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000077 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.962 [2024-10-27 21:33:30.484021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.962 #43 NEW cov: 12466 ft: 15715 corp: 24/483b lim: 35 exec/s: 43 rss: 75Mb L: 13/35 MS: 1 ChangeBit- 00:08:28.962 #44 NEW cov: 12466 ft: 15722 corp: 25/491b lim: 35 exec/s: 44 rss: 75Mb L: 8/35 MS: 1 CrossOver- 00:08:28.962 [2024-10-27 21:33:30.594346] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.962 [2024-10-27 21:33:30.594373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.962 [2024-10-27 21:33:30.594503] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.962 [2024-10-27 21:33:30.594526] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.962 #45 NEW cov: 12466 ft: 15727 corp: 26/510b lim: 35 exec/s: 45 rss: 75Mb L: 19/35 MS: 1 ChangeByte- 00:08:28.962 [2024-10-27 21:33:30.644982] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.962 [2024-10-27 21:33:30.645012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.962 [2024-10-27 21:33:30.645136] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000041 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.962 [2024-10-27 21:33:30.645153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.962 [2024-10-27 21:33:30.645282] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.962 [2024-10-27 21:33:30.645301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.962 #46 NEW cov: 12466 ft: 15746 corp: 27/544b lim: 35 exec/s: 46 rss: 75Mb L: 34/35 MS: 1 ChangeByte- 00:08:29.221 [2024-10-27 21:33:30.695328] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.695355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.221 [2024-10-27 21:33:30.695482] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.695508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.221 [2024-10-27 21:33:30.695632] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.695665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.221 [2024-10-27 21:33:30.695794] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.695811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:29.221 #47 NEW cov: 12466 ft: 15769 corp: 28/579b lim: 35 exec/s: 47 rss: 75Mb L: 35/35 MS: 1 ChangeByte- 00:08:29.221 [2024-10-27 21:33:30.734520] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000046 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.734550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.221 [2024-10-27 21:33:30.795130] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000046 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.795164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.221 [2024-10-27 21:33:30.795303] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000077 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.795321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.221 [2024-10-27 21:33:30.795457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.795475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.221 #49 NEW cov: 12466 ft: 15808 corp: 29/610b lim: 35 exec/s: 49 rss: 75Mb L: 31/35 MS: 2 CMP-CrossOver- DE: "\303!F\352\026\357z\000"- 00:08:29.221 [2024-10-27 21:33:30.844558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000046 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.844589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.221 #50 NEW cov: 12466 ft: 15839 corp: 30/630b lim: 35 exec/s: 50 rss: 75Mb L: 20/35 MS: 1 PersAutoDict- DE: "\303!F\352\026\357z\000"- 00:08:29.221 [2024-10-27 21:33:30.894827] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.894853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.221 [2024-10-27 21:33:30.894986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000046 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.895008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.221 [2024-10-27 21:33:30.895130] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000034 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.895147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.221 #51 NEW cov: 12466 ft: 15841 corp: 31/652b lim: 35 exec/s: 51 rss: 75Mb L: 22/35 MS: 1 PersAutoDict- DE: "\303!F\352\026\357z\000"- 00:08:29.221 [2024-10-27 21:33:30.944403] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.221 [2024-10-27 21:33:30.944431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.481 #52 NEW cov: 12466 ft: 15849 corp: 32/660b lim: 35 exec/s: 26 rss: 75Mb L: 8/35 MS: 1 EraseBytes- 00:08:29.481 #52 DONE cov: 12466 ft: 15849 corp: 32/660b lim: 35 exec/s: 26 rss: 75Mb 00:08:29.481 ###### Recommended dictionary. ###### 00:08:29.481 "\303!F\352\026\357z\000" # Uses: 2 00:08:29.481 ###### End of recommended dictionary. 
###### 00:08:29.481 Done 52 runs in 2 second(s) 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:29.481 21:33:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:29.481 [2024-10-27 21:33:31.115264] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:29.481 [2024-10-27 21:33:31.115330] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3328249 ] 00:08:29.740 [2024-10-27 21:33:31.434162] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
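The trace above is the complete per-fuzzer setup performed by nvmf/run.sh: derive a TCP port from the fuzzer number, rewrite the target JSON config to listen on that port, write LeakSanitizer suppressions, and launch the harness against it. A condensed sketch of the same pattern, assuming the layout shown in the trace ($SPDK and $OUT are illustrative stand-ins for the long workspace paths, and the redirect of the sed output into /tmp/fuzz_json_15.conf is inferred from the file names in the trace):

    fuzzer_type=15
    port=44$(printf %02d "$fuzzer_type")       # yields 4415, matching port=4415 above
    corpus="$SPDK/../corpus/llvm_nvmf_$(printf %02d "$fuzzer_type")"
    mkdir -p "$corpus"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # Point the target config at the per-fuzzer port (the stock config listens on 4420)
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_${fuzzer_type}.conf"
    # The two leak suppressions written by run.sh@41/42 in the trace above
    echo leak:spdk_nvmf_qpair_disconnect  > /var/tmp/suppress_nvmf_fuzz
    echo leak:nvmf_ctrlr_create          >> /var/tmp/suppress_nvmf_fuzz
    LSAN_OPTIONS="report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0" \
        "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 -P "$OUT/llvm/" \
        -F "$trid" -c "/tmp/fuzz_json_${fuzzer_type}.conf" -t 1 -D "$corpus" -Z "$fuzzer_type"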
00:08:29.999 [2024-10-27 21:33:31.482252] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.999 [2024-10-27 21:33:31.501326] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.999 [2024-10-27 21:33:31.554033] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:29.999 [2024-10-27 21:33:31.570339] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:29.999 INFO: Running with entropic power schedule (0xFF, 100). 00:08:29.999 INFO: Seed: 1954126833 00:08:29.999 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:29.999 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:29.999 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:29.999 INFO: A corpus is not provided, starting from an empty corpus 00:08:29.999 #2 INITED exec/s: 0 rss: 64Mb 00:08:29.999 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:29.999 This may also happen if the target rejected all inputs we tried so far 00:08:30.259 NEW_FUNC[1/702]: 0x4743d8 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:30.259 NEW_FUNC[2/702]: 0x4943e8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:30.259 #8 NEW cov: 12069 ft: 12066 corp: 2/10b lim: 35 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:30.259 [2024-10-27 21:33:31.966603] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.259 [2024-10-27 21:33:31.966659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.518 NEW_FUNC[1/14]: 0x194e8c8 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:08:30.519 NEW_FUNC[2/14]: 0x194eb08 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:08:30.519 #9 NEW cov: 12314 ft: 13049 corp: 3/19b lim: 35 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeBit- 00:08:30.519 [2024-10-27 21:33:32.036539] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.519 [2024-10-27 21:33:32.036570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.519 #10 NEW cov: 12320 ft: 13345 corp: 4/28b lim: 35 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CrossOver- 00:08:30.519 [2024-10-27 21:33:32.106521] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.519 [2024-10-27 21:33:32.106550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.519 #11 NEW cov: 12405 ft: 13558 corp: 5/37b lim: 35 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeByte- 00:08:30.519 [2024-10-27 21:33:32.156589] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000018a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.519 [2024-10-27 21:33:32.156617] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.519 #12 NEW cov: 12405 ft: 13677 corp: 6/46b lim: 35 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeByte- 00:08:30.519 [2024-10-27 21:33:32.226680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000028a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.519 [2024-10-27 21:33:32.226709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.778 #13 NEW cov: 12405 ft: 13762 corp: 7/55b lim: 35 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeByte- 00:08:30.778 [2024-10-27 21:33:32.277445] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000028a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.778 [2024-10-27 21:33:32.277475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.778 [2024-10-27 21:33:32.277625] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.778 [2024-10-27 21:33:32.277645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.778 [2024-10-27 21:33:32.277791] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.778 [2024-10-27 21:33:32.277811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.778 [2024-10-27 21:33:32.277954] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.778 [2024-10-27 21:33:32.277975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.778 #14 NEW cov: 12405 ft: 14464 corp: 8/84b lim: 35 exec/s: 0 rss: 72Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:30.778 [2024-10-27 21:33:32.347506] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.778 [2024-10-27 21:33:32.347534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.778 [2024-10-27 21:33:32.347669] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.778 [2024-10-27 21:33:32.347691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.778 [2024-10-27 21:33:32.347830] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.778 [2024-10-27 21:33:32.347849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.778 [2024-10-27 21:33:32.347996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000025d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.778 [2024-10-27 21:33:32.348015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 
m:0 dnr:0 00:08:30.778 #15 NEW cov: 12405 ft: 14522 corp: 9/112b lim: 35 exec/s: 0 rss: 72Mb L: 28/29 MS: 1 InsertRepeatedBytes- 00:08:30.778 [2024-10-27 21:33:32.396767] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000018a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.778 [2024-10-27 21:33:32.396797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.778 #16 NEW cov: 12405 ft: 14553 corp: 10/121b lim: 35 exec/s: 0 rss: 73Mb L: 9/29 MS: 1 CopyPart- 00:08:30.778 [2024-10-27 21:33:32.466748] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.778 [2024-10-27 21:33:32.466776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.778 #17 NEW cov: 12405 ft: 14604 corp: 11/130b lim: 35 exec/s: 0 rss: 73Mb L: 9/29 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:31.037 [2024-10-27 21:33:32.516925] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.038 [2024-10-27 21:33:32.516959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.038 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:31.038 #18 NEW cov: 12428 ft: 14652 corp: 12/139b lim: 35 exec/s: 0 rss: 73Mb L: 9/29 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:31.038 [2024-10-27 21:33:32.566880] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.038 [2024-10-27 21:33:32.566906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.038 #19 NEW cov: 12428 ft: 14703 corp: 13/149b lim: 35 exec/s: 19 rss: 73Mb L: 10/29 MS: 1 InsertByte- 00:08:31.038 [2024-10-27 21:33:32.636945] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.038 [2024-10-27 21:33:32.636974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.038 #20 NEW cov: 12428 ft: 14718 corp: 14/160b lim: 35 exec/s: 20 rss: 73Mb L: 11/29 MS: 1 InsertByte- 00:08:31.038 [2024-10-27 21:33:32.706987] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.038 [2024-10-27 21:33:32.707016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.038 #21 NEW cov: 12428 ft: 14748 corp: 15/169b lim: 35 exec/s: 21 rss: 73Mb L: 9/29 MS: 1 ShuffleBytes- 00:08:31.038 [2024-10-27 21:33:32.756982] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.038 [2024-10-27 21:33:32.757012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.297 #27 NEW cov: 12428 ft: 14779 corp: 16/180b lim: 35 exec/s: 27 rss: 73Mb L: 11/29 MS: 1 CMP- DE: "\001\000"- 
00:08:31.297 [2024-10-27 21:33:32.807888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000028a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.807917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.297 [2024-10-27 21:33:32.808070] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.808092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.297 [2024-10-27 21:33:32.808226] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.808245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.297 [2024-10-27 21:33:32.808388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.808407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.297 #28 NEW cov: 12428 ft: 14796 corp: 17/209b lim: 35 exec/s: 28 rss: 73Mb L: 29/29 MS: 1 ChangeBit- 00:08:31.297 [2024-10-27 21:33:32.877965] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000028a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.877996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.297 [2024-10-27 21:33:32.878155] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.878175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.297 [2024-10-27 21:33:32.878327] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.878344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.297 [2024-10-27 21:33:32.878494] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.878517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.297 #29 NEW cov: 12428 ft: 14896 corp: 18/242b lim: 35 exec/s: 29 rss: 73Mb L: 33/33 MS: 1 CrossOver- 00:08:31.297 [2024-10-27 21:33:32.927981] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.928011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.297 [2024-10-27 21:33:32.928160] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 
21:33:32.928179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.297 [2024-10-27 21:33:32.928326] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000025d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.928346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.297 [2024-10-27 21:33:32.928497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000025d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.928516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.297 #30 NEW cov: 12428 ft: 14919 corp: 19/271b lim: 35 exec/s: 30 rss: 73Mb L: 29/33 MS: 1 InsertByte- 00:08:31.297 [2024-10-27 21:33:32.997212] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000028a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.297 [2024-10-27 21:33:32.997243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.297 #31 NEW cov: 12428 ft: 14982 corp: 20/280b lim: 35 exec/s: 31 rss: 73Mb L: 9/33 MS: 1 CMP- DE: "\000\000\000\034"- 00:08:31.556 [2024-10-27 21:33:33.047539] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.556 [2024-10-27 21:33:33.047568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.556 [2024-10-27 21:33:33.047715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.556 [2024-10-27 21:33:33.047735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.556 #32 NEW cov: 12428 ft: 15186 corp: 21/297b lim: 35 exec/s: 32 rss: 73Mb L: 17/33 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:08:31.556 [2024-10-27 21:33:33.098316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000058a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.556 [2024-10-27 21:33:33.098344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.556 [2024-10-27 21:33:33.098483] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.556 [2024-10-27 21:33:33.098500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.556 [2024-10-27 21:33:33.098643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.556 [2024-10-27 21:33:33.098663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.556 [2024-10-27 21:33:33.098798] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.556 [2024-10-27 21:33:33.098817] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.557 [2024-10-27 21:33:33.098957] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.557 [2024-10-27 21:33:33.098977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:31.557 #33 NEW cov: 12428 ft: 15241 corp: 22/332b lim: 35 exec/s: 33 rss: 73Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:31.557 [2024-10-27 21:33:33.167391] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.557 [2024-10-27 21:33:33.167422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.557 #34 NEW cov: 12428 ft: 15283 corp: 23/343b lim: 35 exec/s: 34 rss: 73Mb L: 11/35 MS: 1 CrossOver- 00:08:31.557 [2024-10-27 21:33:33.237482] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.557 [2024-10-27 21:33:33.237512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.557 #35 NEW cov: 12428 ft: 15287 corp: 24/352b lim: 35 exec/s: 35 rss: 73Mb L: 9/35 MS: 1 ChangeBinInt- 00:08:31.816 [2024-10-27 21:33:33.287503] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000029a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.287534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.816 #36 NEW cov: 12428 ft: 15303 corp: 25/361b lim: 35 exec/s: 36 rss: 73Mb L: 9/35 MS: 1 ChangeBit- 00:08:31.816 [2024-10-27 21:33:33.337599] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.337629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.816 #37 NEW cov: 12428 ft: 15327 corp: 26/368b lim: 35 exec/s: 37 rss: 73Mb L: 7/35 MS: 1 EraseBytes- 00:08:31.816 [2024-10-27 21:33:33.408311] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000028a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.408340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.816 [2024-10-27 21:33:33.408483] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.408504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.816 [2024-10-27 21:33:33.408647] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.408668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.816 [2024-10-27 21:33:33.408816] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.408834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.816 #38 NEW cov: 12428 ft: 15330 corp: 27/401b lim: 35 exec/s: 38 rss: 73Mb L: 33/35 MS: 1 ChangeBit- 00:08:31.816 [2024-10-27 21:33:33.478310] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.478339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.816 [2024-10-27 21:33:33.478489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.478507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.816 [2024-10-27 21:33:33.478639] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.478657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.816 [2024-10-27 21:33:33.478793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.478812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.816 #39 NEW cov: 12428 ft: 15336 corp: 28/434b lim: 35 exec/s: 39 rss: 73Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:08:31.816 [2024-10-27 21:33:33.528346] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000029a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.528372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.816 [2024-10-27 21:33:33.528508] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000019 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.528525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.816 [2024-10-27 21:33:33.528663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000019 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.528686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.816 [2024-10-27 21:33:33.528835] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.816 [2024-10-27 21:33:33.528852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.076 #40 NEW cov: 12428 ft: 15397 corp: 29/462b lim: 35 exec/s: 40 rss: 74Mb L: 28/35 MS: 1 InsertRepeatedBytes- 00:08:32.076 [2024-10-27 21:33:33.597646] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000008a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:32.076 [2024-10-27 21:33:33.597673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.076 #41 NEW cov: 12428 ft: 15400 corp: 30/471b lim: 35 exec/s: 20 rss: 74Mb L: 9/35 MS: 1 ShuffleBytes- 00:08:32.076 #41 DONE cov: 12428 ft: 15400 corp: 30/471b lim: 35 exec/s: 20 rss: 74Mb 00:08:32.076 ###### Recommended dictionary. ###### 00:08:32.076 "\000\000\000\000\000\000\000\000" # Uses: 1 00:08:32.076 "\001\000" # Uses: 0 00:08:32.076 "\000\000\000\034" # Uses: 0 00:08:32.076 "\003\000\000\000\000\000\000\000" # Uses: 0 00:08:32.076 ###### End of recommended dictionary. ###### 00:08:32.076 Done 41 runs in 2 second(s) 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:32.076 21:33:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:32.076 [2024-10-27 21:33:33.784846] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 
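The per-run coverage trajectory can be recovered from a saved console log such as this one. A small sketch with standard tools; console.log is an illustrative file name, and the pattern matches the '#N NEW cov:' lines above:

    # List each coverage-increasing input: id, cov counters, features, corpus size
    grep -Eo '#[0-9]+ NEW cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b' console.log |
        awk '{ sub(/^#/, "", $1); print "input", $1, "cov", $4, "ft", $6, "corp", $8 }'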
00:08:32.076 [2024-10-27 21:33:33.784911] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3328786 ] 00:08:32.644 [2024-10-27 21:33:34.098071] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:32.644 [2024-10-27 21:33:34.144619] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.644 [2024-10-27 21:33:34.162321] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.644 [2024-10-27 21:33:34.214955] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:32.644 [2024-10-27 21:33:34.231252] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:32.644 INFO: Running with entropic power schedule (0xFF, 100). 00:08:32.644 INFO: Seed: 319157461 00:08:32.644 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:32.644 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:32.644 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:32.644 INFO: A corpus is not provided, starting from an empty corpus 00:08:32.644 #2 INITED exec/s: 0 rss: 64Mb 00:08:32.644 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:32.644 This may also happen if the target rejected all inputs we tried so far 00:08:32.644 [2024-10-27 21:33:34.307887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.644 [2024-10-27 21:33:34.307930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.903 NEW_FUNC[1/716]: 0x475898 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:32.903 NEW_FUNC[2/716]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:32.903 #3 NEW cov: 12267 ft: 12266 corp: 2/22b lim: 105 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:08:33.161 [2024-10-27 21:33:34.647644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.161 [2024-10-27 21:33:34.647684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.161 [2024-10-27 21:33:34.647805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.161 [2024-10-27 21:33:34.647829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.161 [2024-10-27 21:33:34.647957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.161 [2024-10-27 21:33:34.647981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:33.161 #4 NEW cov: 12403 ft: 13418 corp: 3/101b lim: 105 exec/s: 0 rss: 72Mb L: 79/79 MS: 1 InsertRepeatedBytes- 00:08:33.161 [2024-10-27 21:33:34.707238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:168034304 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.161 [2024-10-27 21:33:34.707270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.161 #5 NEW cov: 12409 ft: 13753 corp: 4/122b lim: 105 exec/s: 0 rss: 72Mb L: 21/79 MS: 1 ChangeBit- 00:08:33.161 [2024-10-27 21:33:34.777798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.161 [2024-10-27 21:33:34.777832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.161 [2024-10-27 21:33:34.777910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.161 [2024-10-27 21:33:34.777934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.161 [2024-10-27 21:33:34.778061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.161 [2024-10-27 21:33:34.778085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.161 [2024-10-27 21:33:34.778216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.161 [2024-10-27 21:33:34.778241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.161 #6 NEW cov: 12494 ft: 14504 corp: 5/218b lim: 105 exec/s: 0 rss: 72Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:08:33.161 [2024-10-27 21:33:34.827239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:168034304 len:256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.161 [2024-10-27 21:33:34.827272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.161 #7 NEW cov: 12494 ft: 14581 corp: 6/239b lim: 105 exec/s: 0 rss: 72Mb L: 21/96 MS: 1 ChangeBinInt- 00:08:33.420 [2024-10-27 21:33:34.897375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.420 [2024-10-27 21:33:34.897406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.420 #18 NEW cov: 12494 ft: 14686 corp: 7/276b lim: 105 exec/s: 0 rss: 72Mb L: 37/96 MS: 1 CopyPart- 00:08:33.420 [2024-10-27 21:33:34.947932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.420 [2024-10-27 21:33:34.947972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.420 [2024-10-27 21:33:34.948078] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.420 [2024-10-27 21:33:34.948104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.420 [2024-10-27 21:33:34.948224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.420 [2024-10-27 21:33:34.948246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.420 [2024-10-27 21:33:34.948375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.420 [2024-10-27 21:33:34.948400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.420 #19 NEW cov: 12494 ft: 14755 corp: 8/380b lim: 105 exec/s: 0 rss: 72Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:08:33.420 [2024-10-27 21:33:35.017794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18396445050281131955 len:19790 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.420 [2024-10-27 21:33:35.017825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.420 [2024-10-27 21:33:35.017938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.420 [2024-10-27 21:33:35.017967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.420 [2024-10-27 21:33:35.018104] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.420 [2024-10-27 21:33:35.018132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.420 #22 NEW cov: 12494 ft: 14819 corp: 9/460b lim: 105 exec/s: 0 rss: 73Mb L: 80/104 MS: 3 CMP-InsertByte-InsertRepeatedBytes- DE: "\377\377\377\377\377\377\377\000"- 00:08:33.420 [2024-10-27 21:33:35.067740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.420 [2024-10-27 21:33:35.067772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.420 #23 NEW cov: 12503 ft: 14976 corp: 10/482b lim: 105 exec/s: 0 rss: 73Mb L: 22/104 MS: 1 InsertByte- 00:08:33.420 [2024-10-27 21:33:35.117418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.421 [2024-10-27 21:33:35.117451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.421 #24 NEW cov: 12503 ft: 15046 corp: 11/518b lim: 105 exec/s: 0 rss: 73Mb L: 36/104 MS: 1 InsertRepeatedBytes- 00:08:33.678 [2024-10-27 21:33:35.167537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:33.678 [2024-10-27 21:33:35.167565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.678 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:33.678 #25 NEW cov: 12526 ft: 15125 corp: 12/544b lim: 105 exec/s: 0 rss: 73Mb L: 26/104 MS: 1 InsertRepeatedBytes- 00:08:33.678 [2024-10-27 21:33:35.217629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772168 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.678 [2024-10-27 21:33:35.217656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.678 #31 NEW cov: 12526 ft: 15207 corp: 13/565b lim: 105 exec/s: 0 rss: 73Mb L: 21/104 MS: 1 ChangeBit- 00:08:33.678 [2024-10-27 21:33:35.267409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.678 [2024-10-27 21:33:35.267437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.678 #32 NEW cov: 12526 ft: 15235 corp: 14/594b lim: 105 exec/s: 32 rss: 73Mb L: 29/104 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\000"- 00:08:33.678 [2024-10-27 21:33:35.317603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169345024 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.678 [2024-10-27 21:33:35.317629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.678 #33 NEW cov: 12526 ft: 15262 corp: 15/630b lim: 105 exec/s: 33 rss: 73Mb L: 36/104 MS: 1 CMP- DE: "\030\000"- 00:08:33.678 [2024-10-27 21:33:35.387689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2251799981457416 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.678 [2024-10-27 21:33:35.387724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.937 #34 NEW cov: 12526 ft: 15299 corp: 16/651b lim: 105 exec/s: 34 rss: 73Mb L: 21/104 MS: 1 ChangeBinInt- 00:08:33.937 [2024-10-27 21:33:35.457741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582618879 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.937 [2024-10-27 21:33:35.457774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.937 #35 NEW cov: 12526 ft: 15354 corp: 17/672b lim: 105 exec/s: 35 rss: 73Mb L: 21/104 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\000"- 00:08:33.937 [2024-10-27 21:33:35.507731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772168 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.937 [2024-10-27 21:33:35.507757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.937 #36 NEW cov: 12526 ft: 15364 corp: 18/693b lim: 105 exec/s: 36 rss: 73Mb L: 21/104 MS: 1 CopyPart- 00:08:33.937 [2024-10-27 21:33:35.558309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:29299 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:33.937 [2024-10-27 21:33:35.558341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.937 [2024-10-27 21:33:35.558417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.937 [2024-10-27 21:33:35.558439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.937 [2024-10-27 21:33:35.558557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.937 [2024-10-27 21:33:35.558581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.937 [2024-10-27 21:33:35.558702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.937 [2024-10-27 21:33:35.558722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.937 #37 NEW cov: 12526 ft: 15420 corp: 19/795b lim: 105 exec/s: 37 rss: 73Mb L: 102/104 MS: 1 InsertRepeatedBytes- 00:08:33.937 [2024-10-27 21:33:35.617784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169345024 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.937 [2024-10-27 21:33:35.617821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.937 #38 NEW cov: 12526 ft: 15450 corp: 20/832b lim: 105 exec/s: 38 rss: 73Mb L: 37/104 MS: 1 InsertByte- 00:08:34.196 [2024-10-27 21:33:35.687757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.196 [2024-10-27 21:33:35.687788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.196 #39 NEW cov: 12526 ft: 15461 corp: 21/854b lim: 105 exec/s: 39 rss: 73Mb L: 22/104 MS: 1 CopyPart- 00:08:34.196 [2024-10-27 21:33:35.757818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2251799981457416 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.196 [2024-10-27 21:33:35.757851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.196 #40 NEW cov: 12526 ft: 15473 corp: 22/875b lim: 105 exec/s: 40 rss: 73Mb L: 21/104 MS: 1 ShuffleBytes- 00:08:34.196 [2024-10-27 21:33:35.827844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.196 [2024-10-27 21:33:35.827875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.196 #41 NEW cov: 12526 ft: 15475 corp: 23/897b lim: 105 exec/s: 41 rss: 73Mb L: 22/104 MS: 1 ShuffleBytes- 00:08:34.196 [2024-10-27 21:33:35.898551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:209 len:29299 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.196 [2024-10-27 21:33:35.898587] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.196 [2024-10-27 21:33:35.898698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.196 [2024-10-27 21:33:35.898724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.196 [2024-10-27 21:33:35.898836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.196 [2024-10-27 21:33:35.898857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.196 [2024-10-27 21:33:35.898988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.196 [2024-10-27 21:33:35.899014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.455 #42 NEW cov: 12526 ft: 15487 corp: 24/1000b lim: 105 exec/s: 42 rss: 74Mb L: 103/104 MS: 1 InsertByte- 00:08:34.455 [2024-10-27 21:33:35.968010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582618879 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.455 [2024-10-27 21:33:35.968038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.455 #43 NEW cov: 12526 ft: 15554 corp: 25/1039b lim: 105 exec/s: 43 rss: 74Mb L: 39/104 MS: 1 CopyPart- 00:08:34.455 [2024-10-27 21:33:36.038000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.455 [2024-10-27 21:33:36.038034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.455 #44 NEW cov: 12526 ft: 15599 corp: 26/1068b lim: 105 exec/s: 44 rss: 74Mb L: 29/104 MS: 1 ChangeBinInt- 00:08:34.455 [2024-10-27 21:33:36.108525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18396445050281131955 len:19790 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.455 [2024-10-27 21:33:36.108558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.455 [2024-10-27 21:33:36.108663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.455 [2024-10-27 21:33:36.108688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.455 [2024-10-27 21:33:36.108815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.455 [2024-10-27 21:33:36.108837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.455 #45 NEW cov: 12526 ft: 15638 corp: 27/1148b lim: 105 exec/s: 45 rss: 74Mb L: 80/104 MS: 1 ChangeBinInt- 00:08:34.455 [2024-10-27 21:33:36.178158] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.455 [2024-10-27 21:33:36.178186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.714 #46 NEW cov: 12526 ft: 15646 corp: 28/1169b lim: 105 exec/s: 46 rss: 74Mb L: 21/104 MS: 1 ShuffleBytes- 00:08:34.714 [2024-10-27 21:33:36.228778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.714 [2024-10-27 21:33:36.228812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.714 [2024-10-27 21:33:36.228949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.714 [2024-10-27 21:33:36.228974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.714 [2024-10-27 21:33:36.229105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.715 [2024-10-27 21:33:36.229129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.715 [2024-10-27 21:33:36.229258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.715 [2024-10-27 21:33:36.229280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.715 #47 NEW cov: 12526 ft: 15660 corp: 29/1271b lim: 105 exec/s: 47 rss: 74Mb L: 102/104 MS: 1 InsertRepeatedBytes- 00:08:34.715 [2024-10-27 21:33:36.278833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.715 [2024-10-27 21:33:36.278869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.715 [2024-10-27 21:33:36.278969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:38401 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.715 [2024-10-27 21:33:36.278996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.715 [2024-10-27 21:33:36.279121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.715 [2024-10-27 21:33:36.279145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.715 [2024-10-27 21:33:36.279270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.715 [2024-10-27 21:33:36.279294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.715 #48 NEW cov: 12526 ft: 15677 corp: 30/1368b lim: 105 exec/s: 24 rss: 74Mb L: 97/104 MS: 1 InsertByte- 00:08:34.715 #48 DONE cov: 12526 ft: 15677 corp: 30/1368b lim: 105 
exec/s: 24 rss: 74Mb 00:08:34.715 ###### Recommended dictionary. ###### 00:08:34.715 "\377\377\377\377\377\377\377\000" # Uses: 2 00:08:34.715 "\030\000" # Uses: 0 00:08:34.715 ###### End of recommended dictionary. ###### 00:08:34.715 Done 48 runs in 2 second(s) 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:34.715 21:33:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:34.974 [2024-10-27 21:33:36.451674] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:34.974 [2024-10-27 21:33:36.451765] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3329320 ] 00:08:35.233 [2024-10-27 21:33:36.771159] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
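The 'Recommended dictionary' block above lists byte sequences (shown in C octal escapes) that libFuzzer found useful through its CMP and PersAutoDict mutations. As a sketch, they can be transcribed into a standard libFuzzer/AFL dictionary file, which uses \xNN hex escapes instead; plain libFuzzer targets accept -dict=<file>, though whether the SPDK wrapper forwards that flag is not shown in this log:

    # Transcribe the two recommended entries above into libFuzzer dict format
    # (the file name nvmf.dict is illustrative)
    printf '%s\n' \
        'kw1="\xff\xff\xff\xff\xff\xff\xff\x00"' \
        'kw2="\x18\x00"' > nvmf.dict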
00:08:35.233 [2024-10-27 21:33:36.818602] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.233 [2024-10-27 21:33:36.840417] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.233 [2024-10-27 21:33:36.893053] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:35.233 [2024-10-27 21:33:36.909360] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:35.233 INFO: Running with entropic power schedule (0xFF, 100). 00:08:35.233 INFO: Seed: 2996158084 00:08:35.233 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:35.233 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:35.233 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:35.233 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.233 #2 INITED exec/s: 0 rss: 65Mb 00:08:35.233 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:35.233 This may also happen if the target rejected all inputs we tried so far 00:08:35.233 [2024-10-27 21:33:36.957917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.233 [2024-10-27 21:33:36.957961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.750 NEW_FUNC[1/717]: 0x478c18 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:35.750 NEW_FUNC[2/717]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:35.750 #9 NEW cov: 12310 ft: 12309 corp: 2/33b lim: 120 exec/s: 0 rss: 72Mb L: 32/32 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:08:35.750 [2024-10-27 21:33:37.307881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.750 [2024-10-27 21:33:37.307921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.750 #10 NEW cov: 12425 ft: 12943 corp: 3/65b lim: 120 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeBit- 00:08:35.750 [2024-10-27 21:33:37.397785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.750 [2024-10-27 21:33:37.397818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.750 #11 NEW cov: 12431 ft: 13190 corp: 4/97b lim: 120 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:36.008 [2024-10-27 21:33:37.487809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.008 [2024-10-27 21:33:37.487845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.008 #12 NEW cov: 12516 ft: 13468 corp: 5/129b lim: 120 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeBit- 00:08:36.008 [2024-10-27 21:33:37.547817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:34953 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.008 [2024-10-27 21:33:37.547849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.008 #13 NEW cov: 12516 ft: 13612 corp: 6/166b lim: 120 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 CopyPart- 00:08:36.008 [2024-10-27 21:33:37.608008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14540374737248963017 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.008 [2024-10-27 21:33:37.608038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.008 [2024-10-27 21:33:37.608085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.008 [2024-10-27 21:33:37.608103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.008 [2024-10-27 21:33:37.608133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.008 [2024-10-27 21:33:37.608150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.008 [2024-10-27 21:33:37.608179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.008 [2024-10-27 21:33:37.608195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.008 #14 NEW cov: 12516 ft: 14629 corp: 7/275b lim: 120 exec/s: 0 rss: 72Mb L: 109/109 MS: 1 InsertRepeatedBytes- 00:08:36.008 [2024-10-27 21:33:37.667823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.008 [2024-10-27 21:33:37.667853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.266 #15 NEW cov: 12516 ft: 14773 corp: 8/308b lim: 120 exec/s: 0 rss: 72Mb L: 33/109 MS: 1 InsertByte- 00:08:36.266 [2024-10-27 21:33:37.757867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.266 [2024-10-27 21:33:37.757898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.266 #16 NEW cov: 12516 ft: 14830 corp: 9/340b lim: 120 exec/s: 0 rss: 73Mb L: 32/109 MS: 1 ShuffleBytes- 00:08:36.266 [2024-10-27 21:33:37.807902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9838263507974916232 len:65417 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.266 [2024-10-27 21:33:37.807934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.266 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:36.266 #19 NEW cov: 12533 ft: 14887 corp: 10/366b lim: 120 exec/s: 0 rss: 
73Mb L: 26/109 MS: 3 EraseBytes-CrossOver-CopyPart- 00:08:36.266 [2024-10-27 21:33:37.857864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65528 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.266 [2024-10-27 21:33:37.857894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.266 #20 NEW cov: 12533 ft: 14933 corp: 11/401b lim: 120 exec/s: 0 rss: 73Mb L: 35/109 MS: 1 CopyPart- 00:08:36.266 [2024-10-27 21:33:37.908878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9838263507974916232 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.266 [2024-10-27 21:33:37.908906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.266 [2024-10-27 21:33:37.908974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.266 [2024-10-27 21:33:37.908990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.266 [2024-10-27 21:33:37.909044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.266 [2024-10-27 21:33:37.909059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.266 [2024-10-27 21:33:37.909114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.266 [2024-10-27 21:33:37.909129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.266 #21 NEW cov: 12533 ft: 14999 corp: 12/509b lim: 120 exec/s: 21 rss: 73Mb L: 108/109 MS: 1 InsertRepeatedBytes- 00:08:36.266 [2024-10-27 21:33:37.968440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:34953 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.266 [2024-10-27 21:33:37.968467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.524 #22 NEW cov: 12533 ft: 15037 corp: 13/547b lim: 120 exec/s: 22 rss: 73Mb L: 38/109 MS: 1 InsertByte- 00:08:36.524 [2024-10-27 21:33:38.028497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:34953 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.524 [2024-10-27 21:33:38.028525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.524 #23 NEW cov: 12533 ft: 15203 corp: 14/584b lim: 120 exec/s: 23 rss: 73Mb L: 37/109 MS: 1 ChangeByte- 00:08:36.524 [2024-10-27 21:33:38.068903] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4287102976 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.524 [2024-10-27 21:33:38.068931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.524 [2024-10-27 21:33:38.068986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 
len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.524 [2024-10-27 21:33:38.069002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.524 [2024-10-27 21:33:38.069057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.524 [2024-10-27 21:33:38.069073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.524 [2024-10-27 21:33:38.069124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.524 [2024-10-27 21:33:38.069140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.524 #24 NEW cov: 12533 ft: 15231 corp: 15/694b lim: 120 exec/s: 24 rss: 73Mb L: 110/110 MS: 1 InsertRepeatedBytes- 00:08:36.524 [2024-10-27 21:33:38.108486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.524 [2024-10-27 21:33:38.108512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.524 #25 NEW cov: 12533 ft: 15259 corp: 16/724b lim: 120 exec/s: 25 rss: 73Mb L: 30/110 MS: 1 EraseBytes- 00:08:36.524 [2024-10-27 21:33:38.168538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18392982353157816319 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.524 [2024-10-27 21:33:38.168564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.524 #26 NEW cov: 12533 ft: 15272 corp: 17/756b lim: 120 exec/s: 26 rss: 73Mb L: 32/110 MS: 1 ChangeByte- 00:08:36.524 [2024-10-27 21:33:38.208540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9838263507974916232 len:65417 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.524 [2024-10-27 21:33:38.208566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.524 #27 NEW cov: 12533 ft: 15291 corp: 18/782b lim: 120 exec/s: 27 rss: 73Mb L: 26/110 MS: 1 ShuffleBytes- 00:08:36.524 [2024-10-27 21:33:38.248546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.524 [2024-10-27 21:33:38.248573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.781 #28 NEW cov: 12533 ft: 15306 corp: 19/814b lim: 120 exec/s: 28 rss: 73Mb L: 32/110 MS: 1 ChangeByte- 00:08:36.782 [2024-10-27 21:33:38.289036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4287102976 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 [2024-10-27 21:33:38.289064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.782 [2024-10-27 21:33:38.289110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 
[2024-10-27 21:33:38.289125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.782 [2024-10-27 21:33:38.289178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 [2024-10-27 21:33:38.289194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.782 [2024-10-27 21:33:38.289247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 [2024-10-27 21:33:38.289262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.782 #29 NEW cov: 12533 ft: 15345 corp: 20/924b lim: 120 exec/s: 29 rss: 73Mb L: 110/110 MS: 1 ChangeBit- 00:08:36.782 [2024-10-27 21:33:38.348757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 [2024-10-27 21:33:38.348784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.782 [2024-10-27 21:33:38.348835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65417 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 [2024-10-27 21:33:38.348851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.782 #30 NEW cov: 12533 ft: 15722 corp: 21/973b lim: 120 exec/s: 30 rss: 73Mb L: 49/110 MS: 1 InsertRepeatedBytes- 00:08:36.782 [2024-10-27 21:33:38.389077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14540374737248963017 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 [2024-10-27 21:33:38.389104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.782 [2024-10-27 21:33:38.389150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 [2024-10-27 21:33:38.389166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.782 [2024-10-27 21:33:38.389221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 [2024-10-27 21:33:38.389236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.782 [2024-10-27 21:33:38.389289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:14540374740453411273 len:51658 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 [2024-10-27 21:33:38.389304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.782 #31 NEW cov: 12533 ft: 15730 corp: 22/1082b lim: 120 exec/s: 31 rss: 73Mb L: 109/110 MS: 1 ChangeBit- 00:08:36.782 [2024-10-27 21:33:38.448653] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446613231825846271 len:34953 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.782 [2024-10-27 21:33:38.448680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.782 #32 NEW cov: 12533 ft: 15773 corp: 23/1120b lim: 120 exec/s: 32 rss: 73Mb L: 38/110 MS: 1 ShuffleBytes- 00:08:37.040 [2024-10-27 21:33:38.508729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65528 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.040 [2024-10-27 21:33:38.508757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.040 #33 NEW cov: 12533 ft: 15781 corp: 24/1156b lim: 120 exec/s: 33 rss: 73Mb L: 36/110 MS: 1 InsertByte- 00:08:37.040 [2024-10-27 21:33:38.568703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446613231825780735 len:34953 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.040 [2024-10-27 21:33:38.568731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.040 #34 NEW cov: 12533 ft: 15794 corp: 25/1194b lim: 120 exec/s: 34 rss: 73Mb L: 38/110 MS: 1 ChangeBinInt- 00:08:37.040 [2024-10-27 21:33:38.629022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.040 [2024-10-27 21:33:38.629049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.040 [2024-10-27 21:33:38.629094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.040 [2024-10-27 21:33:38.629110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.040 [2024-10-27 21:33:38.629164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.040 [2024-10-27 21:33:38.629180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.040 #35 NEW cov: 12533 ft: 16083 corp: 26/1285b lim: 120 exec/s: 35 rss: 73Mb L: 91/110 MS: 1 InsertRepeatedBytes- 00:08:37.040 [2024-10-27 21:33:38.668711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9838263505978427528 len:34827 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.040 [2024-10-27 21:33:38.668738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.040 #38 NEW cov: 12533 ft: 16098 corp: 27/1321b lim: 120 exec/s: 38 rss: 74Mb L: 36/110 MS: 3 EraseBytes-EraseBytes-CopyPart- 00:08:37.040 [2024-10-27 21:33:38.728779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.040 [2024-10-27 21:33:38.728806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.299 #39 NEW cov: 12533 ft: 16130 
corp: 28/1347b lim: 120 exec/s: 39 rss: 74Mb L: 26/110 MS: 1 EraseBytes- 00:08:37.299 [2024-10-27 21:33:38.789238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.300 [2024-10-27 21:33:38.789265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.300 [2024-10-27 21:33:38.789312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.300 [2024-10-27 21:33:38.789328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.300 [2024-10-27 21:33:38.789383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.300 [2024-10-27 21:33:38.789397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.300 [2024-10-27 21:33:38.789453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.300 [2024-10-27 21:33:38.789468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.300 [2024-10-27 21:33:38.829238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.300 [2024-10-27 21:33:38.829264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.300 [2024-10-27 21:33:38.829317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.300 [2024-10-27 21:33:38.829332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.300 [2024-10-27 21:33:38.829385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.300 [2024-10-27 21:33:38.829399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.300 [2024-10-27 21:33:38.829452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446743562608443391 len:34953 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.300 [2024-10-27 21:33:38.829467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.300 #41 NEW cov: 12540 ft: 16160 corp: 29/1451b lim: 120 exec/s: 41 rss: 74Mb L: 104/110 MS: 2 InsertRepeatedBytes-CopyPart- 00:08:37.300 [2024-10-27 21:33:38.868844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9838263505978427528 len:34827 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.300 [2024-10-27 21:33:38.868871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:37.300 #42 NEW cov: 12540 ft: 16215 corp: 30/1487b lim: 120 exec/s: 42 rss: 74Mb L: 36/110 MS: 1 ChangeBit- 00:08:37.300 [2024-10-27 21:33:38.928900] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9838263507974916232 len:65417 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.300 [2024-10-27 21:33:38.928927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.300 #43 NEW cov: 12540 ft: 16225 corp: 31/1513b lim: 120 exec/s: 21 rss: 74Mb L: 26/110 MS: 1 ChangeByte- 00:08:37.300 #43 DONE cov: 12540 ft: 16225 corp: 31/1513b lim: 120 exec/s: 21 rss: 74Mb 00:08:37.300 Done 43 runs in 2 second(s) 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:37.558 21:33:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:37.558 [2024-10-27 21:33:39.096016] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 
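The "#N NEW" status lines throughout these runs are standard libFuzzer progress reports: cov counts covered code edges (the inline 8-bit counters the banner mentions), ft counts distinct coverage features, corp gives the corpus size as inputs/bytes, lim is the current input-length cap, exec/s the execution rate, rss resident memory, L the new input's length versus the largest input in the corpus, and MS the mutation sequence that produced it. The "Recommended dictionary" a run prints at the end (e.g. "\377\377\377\377\377\377\377\000" and "\030\000" after run 16) lists byte sequences libFuzzer found productive, and they can be fed back into a later run. A hedged sketch follows, assuming the llvm_nvme_fuzz wrapper forwards unrecognized flags such as -dict= through to libFuzzer, and noting that the report prints octal escapes while dictionary files conventionally use hex (\xNN):

  # Sketch only: transcribe run 16's recommended tokens into libFuzzer's
  # dictionary format (\377 octal == \xFF hex, \030 == \x18, \000 == \x00).
  printf '%s\n' '"\xFF\xFF\xFF\xFF\xFF\xFF\xFF\x00"' '"\x18\x00"' > /tmp/nvmf_16.dict

Appending -dict=/tmp/nvmf_16.dict to an llvm_nvme_fuzz invocation would then bias mutations toward these tokens; whether this wrapper actually passes the flag through to libFuzzer is an assumption not confirmed by this log.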
00:08:37.559 [2024-10-27 21:33:39.096094] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3329642 ] 00:08:37.817 [2024-10-27 21:33:39.414698] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:37.817 [2024-10-27 21:33:39.460858] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.817 [2024-10-27 21:33:39.477887] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.817 [2024-10-27 21:33:39.530255] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:38.078 [2024-10-27 21:33:39.546579] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:38.078 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.078 INFO: Seed: 1340177037 00:08:38.078 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:38.078 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:38.078 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:38.078 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.078 #2 INITED exec/s: 0 rss: 65Mb 00:08:38.078 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:38.078 This may also happen if the target rejected all inputs we tried so far 00:08:38.078 [2024-10-27 21:33:39.601840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.078 [2024-10-27 21:33:39.601869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.078 [2024-10-27 21:33:39.601909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.078 [2024-10-27 21:33:39.601923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.078 [2024-10-27 21:33:39.601975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.078 [2024-10-27 21:33:39.601989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.336 NEW_FUNC[1/715]: 0x47c508 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:38.336 NEW_FUNC[2/715]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:38.336 #23 NEW cov: 12255 ft: 12252 corp: 2/67b lim: 100 exec/s: 0 rss: 72Mb L: 66/66 MS: 1 InsertRepeatedBytes- 00:08:38.336 [2024-10-27 21:33:39.932076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.336 [2024-10-27 21:33:39.932108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.336 [2024-10-27 21:33:39.932147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.336 [2024-10-27 21:33:39.932160] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.336 [2024-10-27 21:33:39.932212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.336 [2024-10-27 21:33:39.932226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.336 [2024-10-27 21:33:39.932276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.336 [2024-10-27 21:33:39.932290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.336 #24 NEW cov: 12368 ft: 13083 corp: 3/157b lim: 100 exec/s: 0 rss: 73Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:38.336 [2024-10-27 21:33:39.991951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.336 [2024-10-27 21:33:39.991978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.336 [2024-10-27 21:33:39.992026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.336 [2024-10-27 21:33:39.992038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.337 [2024-10-27 21:33:39.992089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.337 [2024-10-27 21:33:39.992102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.337 [2024-10-27 21:33:39.992154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.337 [2024-10-27 21:33:39.992167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.337 #35 NEW cov: 12374 ft: 13279 corp: 4/247b lim: 100 exec/s: 0 rss: 73Mb L: 90/90 MS: 1 ShuffleBytes- 00:08:38.337 [2024-10-27 21:33:40.052005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.337 [2024-10-27 21:33:40.052035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.337 [2024-10-27 21:33:40.052072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.337 [2024-10-27 21:33:40.052086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.337 [2024-10-27 21:33:40.052138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.337 [2024-10-27 21:33:40.052152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.337 [2024-10-27 21:33:40.052203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.337 [2024-10-27 21:33:40.052217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.595 #36 NEW cov: 12459 ft: 13536 corp: 5/337b lim: 
100 exec/s: 0 rss: 73Mb L: 90/90 MS: 1 ChangeBit- 00:08:38.595 [2024-10-27 21:33:40.112046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.595 [2024-10-27 21:33:40.112076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.595 [2024-10-27 21:33:40.112111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.595 [2024-10-27 21:33:40.112125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.595 [2024-10-27 21:33:40.112175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.596 [2024-10-27 21:33:40.112190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.596 [2024-10-27 21:33:40.112243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.596 [2024-10-27 21:33:40.112258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.596 #37 NEW cov: 12459 ft: 13719 corp: 6/427b lim: 100 exec/s: 0 rss: 73Mb L: 90/90 MS: 1 ChangeByte- 00:08:38.596 [2024-10-27 21:33:40.152038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.596 [2024-10-27 21:33:40.152065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.596 [2024-10-27 21:33:40.152108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.596 [2024-10-27 21:33:40.152122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.596 [2024-10-27 21:33:40.152173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.596 [2024-10-27 21:33:40.152202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.596 [2024-10-27 21:33:40.152253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.596 [2024-10-27 21:33:40.152266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.596 #38 NEW cov: 12459 ft: 13871 corp: 7/517b lim: 100 exec/s: 0 rss: 73Mb L: 90/90 MS: 1 ChangeByte- 00:08:38.596 [2024-10-27 21:33:40.191651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.596 [2024-10-27 21:33:40.191677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.596 #42 NEW cov: 12459 ft: 14312 corp: 8/548b lim: 100 exec/s: 0 rss: 73Mb L: 31/90 MS: 4 ChangeBit-ChangeByte-CMP-InsertRepeatedBytes- DE: "\377\377"- 00:08:38.596 [2024-10-27 21:33:40.232086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.596 [2024-10-27 21:33:40.232118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:38.596 [2024-10-27 21:33:40.232152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.596 [2024-10-27 21:33:40.232166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.596 [2024-10-27 21:33:40.232217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.596 [2024-10-27 21:33:40.232231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.596 [2024-10-27 21:33:40.232280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.596 [2024-10-27 21:33:40.232294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.596 #43 NEW cov: 12459 ft: 14337 corp: 9/638b lim: 100 exec/s: 0 rss: 73Mb L: 90/90 MS: 1 ShuffleBytes- 00:08:38.596 [2024-10-27 21:33:40.272054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.596 [2024-10-27 21:33:40.272079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.596 [2024-10-27 21:33:40.272143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.596 [2024-10-27 21:33:40.272156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.596 [2024-10-27 21:33:40.272206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.596 [2024-10-27 21:33:40.272220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.596 [2024-10-27 21:33:40.272271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.596 [2024-10-27 21:33:40.272286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.596 #44 NEW cov: 12459 ft: 14445 corp: 10/736b lim: 100 exec/s: 0 rss: 73Mb L: 98/98 MS: 1 CrossOver- 00:08:38.855 [2024-10-27 21:33:40.332072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.855 [2024-10-27 21:33:40.332097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.332144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.855 [2024-10-27 21:33:40.332158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.332208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.855 [2024-10-27 21:33:40.332222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.332272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.855 [2024-10-27 
21:33:40.332284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.855 #45 NEW cov: 12459 ft: 14488 corp: 11/834b lim: 100 exec/s: 0 rss: 73Mb L: 98/98 MS: 1 ChangeBit- 00:08:38.855 [2024-10-27 21:33:40.392108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.855 [2024-10-27 21:33:40.392135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.392187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.855 [2024-10-27 21:33:40.392204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.392257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.855 [2024-10-27 21:33:40.392272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.392327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.855 [2024-10-27 21:33:40.392341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.855 #46 NEW cov: 12459 ft: 14594 corp: 12/924b lim: 100 exec/s: 0 rss: 73Mb L: 90/98 MS: 1 ChangeBinInt- 00:08:38.855 [2024-10-27 21:33:40.432119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.855 [2024-10-27 21:33:40.432145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.432211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.855 [2024-10-27 21:33:40.432225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.432273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.855 [2024-10-27 21:33:40.432287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.432338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.855 [2024-10-27 21:33:40.432352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.855 #47 NEW cov: 12459 ft: 14606 corp: 13/1015b lim: 100 exec/s: 0 rss: 73Mb L: 91/98 MS: 1 InsertByte- 00:08:38.855 [2024-10-27 21:33:40.472171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.855 [2024-10-27 21:33:40.472198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.472239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.855 [2024-10-27 21:33:40.472254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.472303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.855 [2024-10-27 21:33:40.472317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.472368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.855 [2024-10-27 21:33:40.472381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.855 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:38.855 #48 NEW cov: 12482 ft: 14657 corp: 14/1105b lim: 100 exec/s: 0 rss: 73Mb L: 90/98 MS: 1 ChangeBinInt- 00:08:38.855 [2024-10-27 21:33:40.532083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.855 [2024-10-27 21:33:40.532109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.532247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.855 [2024-10-27 21:33:40.532258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.855 [2024-10-27 21:33:40.532275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.855 [2024-10-27 21:33:40.532288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.855 #49 NEW cov: 12482 ft: 14696 corp: 15/1175b lim: 100 exec/s: 0 rss: 74Mb L: 70/98 MS: 1 EraseBytes- 00:08:38.855 [2024-10-27 21:33:40.572158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.855 [2024-10-27 21:33:40.572184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.856 [2024-10-27 21:33:40.572236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.856 [2024-10-27 21:33:40.572250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.856 [2024-10-27 21:33:40.572300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.856 [2024-10-27 21:33:40.572315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.856 [2024-10-27 21:33:40.572367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.856 [2024-10-27 21:33:40.572381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.115 #50 NEW cov: 12482 ft: 14748 corp: 16/1260b lim: 100 exec/s: 50 rss: 74Mb L: 85/98 MS: 1 EraseBytes- 00:08:39.115 [2024-10-27 21:33:40.612099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.115 [2024-10-27 21:33:40.612127] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.612171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.115 [2024-10-27 21:33:40.612184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.612235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.115 [2024-10-27 21:33:40.612248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.115 #51 NEW cov: 12482 ft: 14781 corp: 17/1338b lim: 100 exec/s: 51 rss: 74Mb L: 78/98 MS: 1 CrossOver- 00:08:39.115 [2024-10-27 21:33:40.672275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.115 [2024-10-27 21:33:40.672302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.672368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.115 [2024-10-27 21:33:40.672382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.672434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.115 [2024-10-27 21:33:40.672449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.672501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.115 [2024-10-27 21:33:40.672515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.115 #52 NEW cov: 12482 ft: 14808 corp: 18/1436b lim: 100 exec/s: 52 rss: 74Mb L: 98/98 MS: 1 ChangeBit- 00:08:39.115 [2024-10-27 21:33:40.732298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.115 [2024-10-27 21:33:40.732325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.732374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.115 [2024-10-27 21:33:40.732388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.732439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.115 [2024-10-27 21:33:40.732452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.732505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.115 [2024-10-27 21:33:40.732518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.115 #53 NEW cov: 12482 ft: 14865 corp: 
19/1535b lim: 100 exec/s: 53 rss: 74Mb L: 99/99 MS: 1 InsertByte- 00:08:39.115 [2024-10-27 21:33:40.792266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.115 [2024-10-27 21:33:40.792293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.792340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.115 [2024-10-27 21:33:40.792352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.792402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.115 [2024-10-27 21:33:40.792416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.792470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.115 [2024-10-27 21:33:40.792483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.115 #54 NEW cov: 12482 ft: 14916 corp: 20/1625b lim: 100 exec/s: 54 rss: 74Mb L: 90/99 MS: 1 ShuffleBytes- 00:08:39.115 [2024-10-27 21:33:40.832179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.115 [2024-10-27 21:33:40.832205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.832238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.115 [2024-10-27 21:33:40.832251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.115 [2024-10-27 21:33:40.832301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.115 [2024-10-27 21:33:40.832316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.374 #55 NEW cov: 12482 ft: 14937 corp: 21/1691b lim: 100 exec/s: 55 rss: 74Mb L: 66/99 MS: 1 ChangeBit- 00:08:39.374 [2024-10-27 21:33:40.872329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.374 [2024-10-27 21:33:40.872355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.374 [2024-10-27 21:33:40.872415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.374 [2024-10-27 21:33:40.872429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.374 [2024-10-27 21:33:40.872474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.374 [2024-10-27 21:33:40.872487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.374 [2024-10-27 21:33:40.872540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:3 nsid:0 00:08:39.374 [2024-10-27 21:33:40.872554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.374 #56 NEW cov: 12482 ft: 14978 corp: 22/1781b lim: 100 exec/s: 56 rss: 74Mb L: 90/99 MS: 1 ChangeByte- 00:08:39.374 [2024-10-27 21:33:40.912350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.374 [2024-10-27 21:33:40.912376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.374 [2024-10-27 21:33:40.912423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.374 [2024-10-27 21:33:40.912438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.374 [2024-10-27 21:33:40.912487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.374 [2024-10-27 21:33:40.912501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.374 [2024-10-27 21:33:40.912551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.374 [2024-10-27 21:33:40.912564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.374 #57 NEW cov: 12482 ft: 14989 corp: 23/1880b lim: 100 exec/s: 57 rss: 74Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:39.374 [2024-10-27 21:33:40.972178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.374 [2024-10-27 21:33:40.972204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.374 [2024-10-27 21:33:40.972239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.374 [2024-10-27 21:33:40.972252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.374 #58 NEW cov: 12482 ft: 15234 corp: 24/1938b lim: 100 exec/s: 58 rss: 74Mb L: 58/99 MS: 1 InsertRepeatedBytes- 00:08:39.374 [2024-10-27 21:33:41.012132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.374 [2024-10-27 21:33:41.012159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.374 [2024-10-27 21:33:41.012220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.374 [2024-10-27 21:33:41.012234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.374 #59 NEW cov: 12482 ft: 15248 corp: 25/1992b lim: 100 exec/s: 59 rss: 74Mb L: 54/99 MS: 1 EraseBytes- 00:08:39.374 [2024-10-27 21:33:41.072384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.374 [2024-10-27 21:33:41.072410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.374 [2024-10-27 21:33:41.072476] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.374 [2024-10-27 21:33:41.072491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.374 [2024-10-27 21:33:41.072543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.375 [2024-10-27 21:33:41.072556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.375 [2024-10-27 21:33:41.072607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.375 [2024-10-27 21:33:41.072622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.633 #60 NEW cov: 12482 ft: 15264 corp: 26/2083b lim: 100 exec/s: 60 rss: 74Mb L: 91/99 MS: 1 InsertByte- 00:08:39.633 [2024-10-27 21:33:41.132432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.633 [2024-10-27 21:33:41.132458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.132506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.634 [2024-10-27 21:33:41.132521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.132572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.634 [2024-10-27 21:33:41.132585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.132637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.634 [2024-10-27 21:33:41.132651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.634 #61 NEW cov: 12482 ft: 15276 corp: 27/2173b lim: 100 exec/s: 61 rss: 74Mb L: 90/99 MS: 1 ShuffleBytes- 00:08:39.634 [2024-10-27 21:33:41.192458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.634 [2024-10-27 21:33:41.192484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.192536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.634 [2024-10-27 21:33:41.192549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.192600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.634 [2024-10-27 21:33:41.192614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.192665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.634 [2024-10-27 21:33:41.192679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.634 #62 NEW cov: 12482 ft: 15283 corp: 28/2264b lim: 100 exec/s: 62 rss: 74Mb L: 91/99 MS: 1 CrossOver- 00:08:39.634 [2024-10-27 21:33:41.252483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.634 [2024-10-27 21:33:41.252508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.252554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.634 [2024-10-27 21:33:41.252568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.252619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.634 [2024-10-27 21:33:41.252633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.252684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.634 [2024-10-27 21:33:41.252698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.634 #63 NEW cov: 12482 ft: 15293 corp: 29/2352b lim: 100 exec/s: 63 rss: 74Mb L: 88/99 MS: 1 InsertRepeatedBytes- 00:08:39.634 [2024-10-27 21:33:41.292496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.634 [2024-10-27 21:33:41.292525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.292560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.634 [2024-10-27 21:33:41.292574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.292624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.634 [2024-10-27 21:33:41.292637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.292702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.634 [2024-10-27 21:33:41.292717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.332489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.634 [2024-10-27 21:33:41.332513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.332578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.634 [2024-10-27 21:33:41.332591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.332641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.634 [2024-10-27 21:33:41.332656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.634 [2024-10-27 21:33:41.332707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.634 [2024-10-27 21:33:41.332721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.634 #65 NEW cov: 12482 ft: 15301 corp: 30/2451b lim: 100 exec/s: 65 rss: 74Mb L: 99/99 MS: 2 ChangeBit-InsertByte- 00:08:39.893 [2024-10-27 21:33:41.372563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.893 [2024-10-27 21:33:41.372589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.372643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.893 [2024-10-27 21:33:41.372656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.372707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.893 [2024-10-27 21:33:41.372721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.372773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.893 [2024-10-27 21:33:41.372787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.893 #66 NEW cov: 12482 ft: 15306 corp: 31/2541b lim: 100 exec/s: 66 rss: 74Mb L: 90/99 MS: 1 CrossOver- 00:08:39.893 [2024-10-27 21:33:41.412606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.893 [2024-10-27 21:33:41.412633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.412700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.893 [2024-10-27 21:33:41.412713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.412765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.893 [2024-10-27 21:33:41.412790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.412840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.893 [2024-10-27 21:33:41.412854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.893 #67 NEW cov: 12482 ft: 15316 corp: 32/2639b lim: 100 exec/s: 67 rss: 74Mb L: 98/99 MS: 1 ChangeByte- 00:08:39.893 [2024-10-27 21:33:41.452574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.893 [2024-10-27 
21:33:41.452599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.452666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.893 [2024-10-27 21:33:41.452679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.452730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.893 [2024-10-27 21:33:41.452744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.452794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.893 [2024-10-27 21:33:41.452807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.893 #68 NEW cov: 12482 ft: 15321 corp: 33/2731b lim: 100 exec/s: 68 rss: 74Mb L: 92/99 MS: 1 PersAutoDict- DE: "\377\377"- 00:08:39.893 [2024-10-27 21:33:41.492637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.893 [2024-10-27 21:33:41.492663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.492714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.893 [2024-10-27 21:33:41.492727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.492777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.893 [2024-10-27 21:33:41.492791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.893 [2024-10-27 21:33:41.492841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.893 [2024-10-27 21:33:41.492854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.893 #69 NEW cov: 12482 ft: 15339 corp: 34/2822b lim: 100 exec/s: 69 rss: 74Mb L: 91/99 MS: 1 PersAutoDict- DE: "\377\377"- 00:08:39.893 [2024-10-27 21:33:41.532596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.894 [2024-10-27 21:33:41.532623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.894 [2024-10-27 21:33:41.532669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.894 [2024-10-27 21:33:41.532682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.894 [2024-10-27 21:33:41.532732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.894 [2024-10-27 21:33:41.532746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 
dnr:1 00:08:39.894 [2024-10-27 21:33:41.532798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.894 [2024-10-27 21:33:41.532811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.894 #70 NEW cov: 12482 ft: 15350 corp: 35/2921b lim: 100 exec/s: 70 rss: 75Mb L: 99/99 MS: 1 ShuffleBytes- 00:08:39.894 [2024-10-27 21:33:41.592664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.894 [2024-10-27 21:33:41.592691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.894 [2024-10-27 21:33:41.592740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.894 [2024-10-27 21:33:41.592755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.894 [2024-10-27 21:33:41.592806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.894 [2024-10-27 21:33:41.592820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.894 [2024-10-27 21:33:41.592873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.894 [2024-10-27 21:33:41.592887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.894 #71 NEW cov: 12482 ft: 15362 corp: 36/3011b lim: 100 exec/s: 35 rss: 75Mb L: 90/99 MS: 1 ShuffleBytes- 00:08:39.894 #71 DONE cov: 12482 ft: 15362 corp: 36/3011b lim: 100 exec/s: 35 rss: 75Mb 00:08:39.894 ###### Recommended dictionary. ###### 00:08:39.894 "\377\377" # Uses: 2 00:08:39.894 ###### End of recommended dictionary. 
###### 00:08:39.894 Done 71 runs in 2 second(s)
00:08:40.153 21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419'
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
21:33:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19
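The xtrace records above show nvmf/run.sh tearing down run 18 and preparing fuzzer 19: start_llvm_fuzz receives the fuzzer number, run time, and core mask (19 1 0x1); the two-digit fuzzer number is appended to "44" to derive a per-fuzzer TCP port (19 -> 4419); the stock fuzz_json.conf is rewritten so the subsystem's trsvcid matches that port; two allocations that intentionally outlive the run are suppressed for LeakSanitizer; and llvm_nvme_fuzz is launched against the resulting transport ID. Condensed into a standalone sketch, reconstructed from the trace above (the output redirections and the flattened variable handling are assumptions, not the verbatim script):

# Sketch of the per-fuzzer setup traced above (assumptions noted inline).
fuzzer_type=19   # -Z: which fuzz target to run
timen=1          # -t: seconds to fuzz
core=0x1         # -m: reactor core mask
port="44$(printf %02d "$fuzzer_type")"   # 19 -> 4419, 20 -> 4420
corpus_dir="/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_${fuzzer_type}"
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
suppress_file=/var/tmp/suppress_nvmf_fuzz
export LSAN_OPTIONS="report_objects=1:suppressions=${suppress_file}:print_suppressions=0"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
mkdir -p "$corpus_dir"
# Point the target's listener at the per-fuzzer port (the redirect is assumed).
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" fuzz_json.conf > "$nvmf_cfg"
# Known long-lived allocations are excluded from leak reports (redirects assumed).
echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
echo leak:nvmf_ctrlr_create >> "$suppress_file"
llvm_nvme_fuzz -m "$core" -s 512 -F "$trid" -c "$nvmf_cfg" -t "$timen" \
    -D "$corpus_dir" -Z "$fuzzer_type"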
00:08:40.673 [2024-10-27 21:33:42.142818] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.673 [2024-10-27 21:33:42.158239] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.673 [2024-10-27 21:33:42.210593] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:40.673 [2024-10-27 21:33:42.226910] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:40.673 INFO: Running with entropic power schedule (0xFF, 100). 00:08:40.673 INFO: Seed: 4018180826 00:08:40.673 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:40.673 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:40.673 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:40.673 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.673 #2 INITED exec/s: 0 rss: 64Mb 00:08:40.673 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:40.673 This may also happen if the target rejected all inputs we tried so far 00:08:40.673 [2024-10-27 21:33:42.296536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361700864274269445 len:1286 00:08:40.673 [2024-10-27 21:33:42.296580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.673 [2024-10-27 21:33:42.296650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 00:08:40.673 [2024-10-27 21:33:42.296667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.673 [2024-10-27 21:33:42.296737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 00:08:40.673 [2024-10-27 21:33:42.296753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.932 NEW_FUNC[1/715]: 0x47f4c8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:40.932 NEW_FUNC[2/715]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:40.932 #3 NEW cov: 12231 ft: 12232 corp: 2/33b lim: 50 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:40.932 [2024-10-27 21:33:42.635683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361700864274269445 len:1286 00:08:40.932 [2024-10-27 21:33:42.635724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.932 [2024-10-27 21:33:42.635842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700868333697787 len:1286 00:08:40.932 [2024-10-27 21:33:42.635865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.932 [2024-10-27 21:33:42.635992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 00:08:40.932 [2024-10-27 
21:33:42.636019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.191 #4 NEW cov: 12346 ft: 12898 corp: 3/65b lim: 50 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:41.191 [2024-10-27 21:33:42.695454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361700864274269445 len:1286 00:08:41.191 [2024-10-27 21:33:42.695488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.191 [2024-10-27 21:33:42.695606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 00:08:41.191 [2024-10-27 21:33:42.695627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.191 #5 NEW cov: 12352 ft: 13432 corp: 4/91b lim: 50 exec/s: 0 rss: 72Mb L: 26/32 MS: 1 EraseBytes- 00:08:41.191 [2024-10-27 21:33:42.735807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361700864274269445 len:1286 00:08:41.191 [2024-10-27 21:33:42.735836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.191 [2024-10-27 21:33:42.735924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700868333697787 len:1286 00:08:41.191 [2024-10-27 21:33:42.735948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.191 [2024-10-27 21:33:42.736055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:41.191 [2024-10-27 21:33:42.736075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.191 [2024-10-27 21:33:42.736193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446468122060783615 len:1286 00:08:41.191 [2024-10-27 21:33:42.736214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.191 #6 NEW cov: 12437 ft: 13884 corp: 5/139b lim: 50 exec/s: 0 rss: 72Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:41.191 [2024-10-27 21:33:42.795424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:289643270236341509 len:1286 00:08:41.191 [2024-10-27 21:33:42.795455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.191 [2024-10-27 21:33:42.795563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 00:08:41.191 [2024-10-27 21:33:42.795584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.191 #12 NEW cov: 12437 ft: 13928 corp: 6/165b lim: 50 exec/s: 0 rss: 72Mb L: 26/48 MS: 1 ChangeBit- 00:08:41.191 [2024-10-27 21:33:42.855822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:289637751203366149 len:1 
00:08:41.191 [2024-10-27 21:33:42.855853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.191 [2024-10-27 21:33:42.855957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:41.191 [2024-10-27 21:33:42.855987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.191 [2024-10-27 21:33:42.856106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864106168320 len:1286 00:08:41.191 [2024-10-27 21:33:42.856130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.191 [2024-10-27 21:33:42.856252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:361700864190383365 len:1286 00:08:41.191 [2024-10-27 21:33:42.856272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.191 #13 NEW cov: 12437 ft: 14031 corp: 7/209b lim: 50 exec/s: 0 rss: 72Mb L: 44/48 MS: 1 InsertRepeatedBytes- 00:08:41.450 [2024-10-27 21:33:42.925847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361700864274269445 len:1286 00:08:41.450 [2024-10-27 21:33:42.925877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.450 [2024-10-27 21:33:42.925997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700868333697787 len:1286 00:08:41.450 [2024-10-27 21:33:42.926019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.450 [2024-10-27 21:33:42.926130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190385413 len:1286 00:08:41.450 [2024-10-27 21:33:42.926151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.450 #14 NEW cov: 12437 ft: 14176 corp: 8/241b lim: 50 exec/s: 0 rss: 72Mb L: 32/48 MS: 1 ChangeBit- 00:08:41.450 [2024-10-27 21:33:42.975919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:289637751203366149 len:1 00:08:41.450 [2024-10-27 21:33:42.975952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.450 [2024-10-27 21:33:42.976039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:41.450 [2024-10-27 21:33:42.976063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.450 [2024-10-27 21:33:42.976178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864106168320 len:1286 00:08:41.450 [2024-10-27 21:33:42.976202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.450 [2024-10-27 21:33:42.976323] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:361700864190383365 len:1286 00:08:41.450 [2024-10-27 21:33:42.976345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.450 #15 NEW cov: 12437 ft: 14235 corp: 9/289b lim: 50 exec/s: 0 rss: 72Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:41.450 [2024-10-27 21:33:43.035721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361821810553324805 len:1286 00:08:41.450 [2024-10-27 21:33:43.035754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.450 [2024-10-27 21:33:43.035867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18087868997536840442 len:1286 00:08:41.450 [2024-10-27 21:33:43.035889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.450 [2024-10-27 21:33:43.036006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383373 len:1286 00:08:41.450 [2024-10-27 21:33:43.036026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.450 #16 NEW cov: 12437 ft: 14305 corp: 10/322b lim: 50 exec/s: 0 rss: 72Mb L: 33/48 MS: 1 InsertByte- 00:08:41.450 [2024-10-27 21:33:43.105806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2229134482628835351 len:2566 00:08:41.450 [2024-10-27 21:33:43.105839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.450 [2024-10-27 21:33:43.105947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 00:08:41.450 [2024-10-27 21:33:43.105975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.450 [2024-10-27 21:33:43.106090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 00:08:41.450 [2024-10-27 21:33:43.106114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.450 #17 NEW cov: 12437 ft: 14335 corp: 11/356b lim: 50 exec/s: 0 rss: 72Mb L: 34/48 MS: 1 CMP- DE: "a\252x\027\036\357z\000"- 00:08:41.450 [2024-10-27 21:33:43.145899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:289637751203366149 len:1 00:08:41.450 [2024-10-27 21:33:43.145935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.451 [2024-10-27 21:33:43.146023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:41.451 [2024-10-27 21:33:43.146047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.451 [2024-10-27 21:33:43.146166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 
nsid:0 lba:361701748869431296 len:1286 00:08:41.451 [2024-10-27 21:33:43.146191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.710 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:41.710 #23 NEW cov: 12460 ft: 14379 corp: 12/388b lim: 50 exec/s: 0 rss: 73Mb L: 32/48 MS: 1 EraseBytes- 00:08:41.710 [2024-10-27 21:33:43.206210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:289637751203366149 len:1 00:08:41.710 [2024-10-27 21:33:43.206239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.710 [2024-10-27 21:33:43.206366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:41.710 [2024-10-27 21:33:43.206389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.710 [2024-10-27 21:33:43.206494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864106168320 len:1286 00:08:41.710 [2024-10-27 21:33:43.206516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.710 [2024-10-27 21:33:43.206630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:361700864190383365 len:1286 00:08:41.710 [2024-10-27 21:33:43.206654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.710 #24 NEW cov: 12460 ft: 14435 corp: 13/436b lim: 50 exec/s: 0 rss: 73Mb L: 48/48 MS: 1 CopyPart- 00:08:41.710 [2024-10-27 21:33:43.255961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361701916541256965 len:64252 00:08:41.710 [2024-10-27 21:33:43.255995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.710 [2024-10-27 21:33:43.256097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18087868997520655731 len:1286 00:08:41.710 [2024-10-27 21:33:43.256118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.710 [2024-10-27 21:33:43.256231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383373 len:1286 00:08:41.710 [2024-10-27 21:33:43.256258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.710 #25 NEW cov: 12460 ft: 14476 corp: 14/469b lim: 50 exec/s: 25 rss: 73Mb L: 33/48 MS: 1 ShuffleBytes- 00:08:41.710 [2024-10-27 21:33:43.315931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361700864274269445 len:1286 00:08:41.710 [2024-10-27 21:33:43.315971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.710 [2024-10-27 21:33:43.316059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700868333697787 len:1286 00:08:41.710 [2024-10-27 21:33:43.316079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.710 [2024-10-27 21:33:43.316192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 00:08:41.710 [2024-10-27 21:33:43.316216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.710 #26 NEW cov: 12460 ft: 14577 corp: 15/501b lim: 50 exec/s: 26 rss: 73Mb L: 32/48 MS: 1 CrossOver- 00:08:41.710 [2024-10-27 21:33:43.356223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7161677109476418403 len:25444 00:08:41.710 [2024-10-27 21:33:43.356255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.710 [2024-10-27 21:33:43.356366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361699766261998341 len:1 00:08:41.710 [2024-10-27 21:33:43.356391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.710 [2024-10-27 21:33:43.356508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:41.710 [2024-10-27 21:33:43.356533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.710 [2024-10-27 21:33:43.356648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:5519032975360 len:1492 00:08:41.710 [2024-10-27 21:33:43.356667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.710 #27 NEW cov: 12460 ft: 14653 corp: 16/545b lim: 50 exec/s: 27 rss: 73Mb L: 44/48 MS: 1 InsertRepeatedBytes- 00:08:41.710 [2024-10-27 21:33:43.415697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361700864274269445 len:1286 00:08:41.710 [2024-10-27 21:33:43.415730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.969 #28 NEW cov: 12460 ft: 15000 corp: 17/558b lim: 50 exec/s: 28 rss: 73Mb L: 13/48 MS: 1 EraseBytes- 00:08:41.969 [2024-10-27 21:33:43.456023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:41.969 [2024-10-27 21:33:43.456053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.456145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 00:08:41.969 [2024-10-27 21:33:43.456167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.456271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 00:08:41.969 [2024-10-27 21:33:43.456292] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.969 #29 NEW cov: 12460 ft: 15015 corp: 18/588b lim: 50 exec/s: 29 rss: 73Mb L: 30/48 MS: 1 EraseBytes- 00:08:41.969 [2024-10-27 21:33:43.496187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:289637751203366149 len:1 00:08:41.969 [2024-10-27 21:33:43.496216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.496301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:41.969 [2024-10-27 21:33:43.496323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.496432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:7037569407899402240 len:7920 00:08:41.969 [2024-10-27 21:33:43.496457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.496570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:361700866152989957 len:1286 00:08:41.969 [2024-10-27 21:33:43.496592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.969 #30 NEW cov: 12460 ft: 15029 corp: 19/636b lim: 50 exec/s: 30 rss: 73Mb L: 48/48 MS: 1 PersAutoDict- DE: "a\252x\027\036\357z\000"- 00:08:41.969 [2024-10-27 21:33:43.546019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361701916541256965 len:64252 00:08:41.969 [2024-10-27 21:33:43.546053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.546137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700864190446341 len:1286 00:08:41.969 [2024-10-27 21:33:43.546162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.546272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383373 len:1286 00:08:41.969 [2024-10-27 21:33:43.546294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.969 #31 NEW cov: 12460 ft: 15097 corp: 20/669b lim: 50 exec/s: 31 rss: 73Mb L: 33/48 MS: 1 CopyPart- 00:08:41.969 [2024-10-27 21:33:43.606124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:41.969 [2024-10-27 21:33:43.606158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.606261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 00:08:41.969 [2024-10-27 21:33:43.606286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.606398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 00:08:41.969 [2024-10-27 21:33:43.606422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.969 #32 NEW cov: 12460 ft: 15111 corp: 21/699b lim: 50 exec/s: 32 rss: 73Mb L: 30/48 MS: 1 ShuffleBytes- 00:08:41.969 [2024-10-27 21:33:43.676352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7161677109476418403 len:25444 00:08:41.969 [2024-10-27 21:33:43.676388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.676482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361699766262019077 len:1 00:08:41.969 [2024-10-27 21:33:43.676508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.676616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:41.969 [2024-10-27 21:33:43.676638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.969 [2024-10-27 21:33:43.676750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:5519032975360 len:1492 00:08:41.969 [2024-10-27 21:33:43.676772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.228 #33 NEW cov: 12460 ft: 15137 corp: 22/743b lim: 50 exec/s: 33 rss: 73Mb L: 44/48 MS: 1 ChangeByte- 00:08:42.228 [2024-10-27 21:33:43.746337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:289637751203366149 len:1 00:08:42.228 [2024-10-27 21:33:43.746371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.228 [2024-10-27 21:33:43.746447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:42.229 [2024-10-27 21:33:43.746470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.229 [2024-10-27 21:33:43.746594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:1412893916528645 len:1286 00:08:42.229 [2024-10-27 21:33:43.746619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.229 [2024-10-27 21:33:43.746737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:361700864190383365 len:1286 00:08:42.229 [2024-10-27 21:33:43.746763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.229 #34 NEW cov: 12460 ft: 15145 corp: 23/791b lim: 50 exec/s: 34 rss: 73Mb L: 48/48 MS: 1 ShuffleBytes- 00:08:42.229 [2024-10-27 21:33:43.786348] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7161677109476418403 len:25444 00:08:42.229 [2024-10-27 21:33:43.786381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.229 [2024-10-27 21:33:43.786473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361699766261998341 len:1 00:08:42.229 [2024-10-27 21:33:43.786495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.229 [2024-10-27 21:33:43.786606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:42.229 [2024-10-27 21:33:43.786630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.229 [2024-10-27 21:33:43.786740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:27027116797198336 len:1492 00:08:42.229 [2024-10-27 21:33:43.786762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.229 #35 NEW cov: 12460 ft: 15164 corp: 24/835b lim: 50 exec/s: 35 rss: 73Mb L: 44/48 MS: 1 ChangeByte- 00:08:42.229 [2024-10-27 21:33:43.836243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2229134482628835351 len:2566 00:08:42.229 [2024-10-27 21:33:43.836279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.229 [2024-10-27 21:33:43.836400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:370708063445124357 len:1286 00:08:42.229 [2024-10-27 21:33:43.836422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.229 [2024-10-27 21:33:43.836528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 00:08:42.229 [2024-10-27 21:33:43.836553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.229 #36 NEW cov: 12460 ft: 15205 corp: 25/870b lim: 50 exec/s: 36 rss: 73Mb L: 35/48 MS: 1 InsertByte- 00:08:42.229 [2024-10-27 21:33:43.906223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361700864274269445 len:1286 00:08:42.229 [2024-10-27 21:33:43.906257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.229 [2024-10-27 21:33:43.906350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 00:08:42.229 [2024-10-27 21:33:43.906375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.229 [2024-10-27 21:33:43.906493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 00:08:42.229 [2024-10-27 21:33:43.906517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:42.229 #37 NEW cov: 12460 ft: 15224 corp: 26/902b lim: 50 exec/s: 37 rss: 73Mb L: 32/48 MS: 1 CrossOver- 00:08:42.488 [2024-10-27 21:33:43.956278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:289643270236341509 len:1286 00:08:42.488 [2024-10-27 21:33:43.956311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:43.956425] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 00:08:42.488 [2024-10-27 21:33:43.956450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.488 #38 NEW cov: 12460 ft: 15248 corp: 27/925b lim: 50 exec/s: 38 rss: 73Mb L: 23/48 MS: 1 EraseBytes- 00:08:42.488 [2024-10-27 21:33:43.996364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2229134482628835351 len:2566 00:08:42.488 [2024-10-27 21:33:43.996399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:43.996522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:370708063445124357 len:1286 00:08:42.488 [2024-10-27 21:33:43.996543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:43.996653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 00:08:42.488 [2024-10-27 21:33:43.996674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.488 #39 NEW cov: 12460 ft: 15265 corp: 28/960b lim: 50 exec/s: 39 rss: 73Mb L: 35/48 MS: 1 ChangeBit- 00:08:42.488 [2024-10-27 21:33:44.056516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2955774249639675141 len:1286 00:08:42.488 [2024-10-27 21:33:44.056551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:44.056646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18087868997536840442 len:1286 00:08:42.488 [2024-10-27 21:33:44.056668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:44.056784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744069515247615 len:65536 00:08:42.488 [2024-10-27 21:33:44.056808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:44.056925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446742999967727615 len:1286 00:08:42.488 [2024-10-27 21:33:44.056948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.488 #40 NEW cov: 12460 ft: 15303 corp: 
29/1009b lim: 50 exec/s: 40 rss: 74Mb L: 49/49 MS: 1 InsertByte- 00:08:42.488 [2024-10-27 21:33:44.126538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:361700864274269445 len:1286 00:08:42.488 [2024-10-27 21:33:44.126571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:44.126677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 00:08:42.488 [2024-10-27 21:33:44.126697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:44.126815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 00:08:42.488 [2024-10-27 21:33:44.126838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:44.126954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:361700864190383365 len:1286 00:08:42.488 [2024-10-27 21:33:44.126988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.488 #41 NEW cov: 12460 ft: 15310 corp: 30/1053b lim: 50 exec/s: 41 rss: 74Mb L: 44/49 MS: 1 CopyPart- 00:08:42.488 [2024-10-27 21:33:44.196369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7161677109476418403 len:25444 00:08:42.488 [2024-10-27 21:33:44.196401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:44.196507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361695346740650757 len:1 00:08:42.488 [2024-10-27 21:33:44.196531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.488 [2024-10-27 21:33:44.196652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:361701748869431392 len:1286 00:08:42.488 [2024-10-27 21:33:44.196673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.748 #42 NEW cov: 12460 ft: 15343 corp: 31/1085b lim: 50 exec/s: 42 rss: 74Mb L: 32/49 MS: 1 EraseBytes- 00:08:42.748 [2024-10-27 21:33:44.266596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:362826764181112069 len:1286 00:08:42.748 [2024-10-27 21:33:44.266629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.748 [2024-10-27 21:33:44.266725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:361700868333697787 len:1286 00:08:42.748 [2024-10-27 21:33:44.266750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.748 [2024-10-27 21:33:44.266865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 
nsid:0 lba:18446744073709551615 len:65536 00:08:42.748 [2024-10-27 21:33:44.266889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.748 [2024-10-27 21:33:44.267025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446468122060783615 len:1286 00:08:42.748 [2024-10-27 21:33:44.267050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.748 #43 NEW cov: 12460 ft: 15351 corp: 32/1133b lim: 50 exec/s: 21 rss: 74Mb L: 48/49 MS: 1 ChangeBinInt- 00:08:42.748 #43 DONE cov: 12460 ft: 15351 corp: 32/1133b lim: 50 exec/s: 21 rss: 74Mb 00:08:42.748 ###### Recommended dictionary. ###### 00:08:42.748 "a\252x\027\036\357z\000" # Uses: 1 00:08:42.748 ###### End of recommended dictionary. ###### 00:08:42.748 Done 43 runs in 2 second(s) 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:42.748 21:33:44 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:42.748 
[2024-10-27 21:33:44.434698] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:42.748 [2024-10-27 21:33:44.434764] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3330682 ] 00:08:43.315 [2024-10-27 21:33:44.749524] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:43.315 [2024-10-27 21:33:44.796685] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.315 [2024-10-27 21:33:44.817542] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.315 [2024-10-27 21:33:44.870261] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:43.315 [2024-10-27 21:33:44.886567] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:43.315 INFO: Running with entropic power schedule (0xFF, 100). 00:08:43.315 INFO: Seed: 2383215347 00:08:43.315 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:43.315 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:43.315 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:43.315 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.315 #2 INITED exec/s: 0 rss: 64Mb 00:08:43.315 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:43.315 This may also happen if the target rejected all inputs we tried so far 00:08:43.316 [2024-10-27 21:33:44.935594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.316 [2024-10-27 21:33:44.935624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.316 [2024-10-27 21:33:44.935658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.316 [2024-10-27 21:33:44.935673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.316 [2024-10-27 21:33:44.935728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.316 [2024-10-27 21:33:44.935743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.574 NEW_FUNC[1/717]: 0x481088 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:43.574 NEW_FUNC[2/717]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:43.574 #8 NEW cov: 12291 ft: 12284 corp: 2/57b lim: 90 exec/s: 0 rss: 72Mb L: 56/56 MS: 1 InsertRepeatedBytes- 00:08:43.574 [2024-10-27 21:33:45.265752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.574 [2024-10-27 21:33:45.265787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.574 [2024-10-27 21:33:45.265843] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.574 [2024-10-27 21:33:45.265860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.574 [2024-10-27 21:33:45.265916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.574 [2024-10-27 21:33:45.265933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.833 #9 NEW cov: 12404 ft: 12877 corp: 3/113b lim: 90 exec/s: 0 rss: 72Mb L: 56/56 MS: 1 CopyPart- 00:08:43.833 [2024-10-27 21:33:45.325687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.833 [2024-10-27 21:33:45.325715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.833 [2024-10-27 21:33:45.325749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.833 [2024-10-27 21:33:45.325765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.833 [2024-10-27 21:33:45.325822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.833 [2024-10-27 21:33:45.325838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.833 #10 NEW cov: 12410 ft: 13234 corp: 4/169b lim: 90 exec/s: 0 rss: 72Mb L: 56/56 MS: 1 ChangeBit- 00:08:43.833 [2024-10-27 21:33:45.385697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.833 [2024-10-27 21:33:45.385725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.833 [2024-10-27 21:33:45.385787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.833 [2024-10-27 21:33:45.385804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.833 [2024-10-27 21:33:45.385859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.833 [2024-10-27 21:33:45.385875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.833 #11 NEW cov: 12495 ft: 13563 corp: 5/234b lim: 90 exec/s: 0 rss: 72Mb L: 65/65 MS: 1 CopyPart- 00:08:43.834 [2024-10-27 21:33:45.425730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.834 [2024-10-27 21:33:45.425759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.834 [2024-10-27 21:33:45.425801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.834 [2024-10-27 21:33:45.425817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.834 [2024-10-27 21:33:45.425873] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.834 [2024-10-27 21:33:45.425888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.834 #12 NEW cov: 12495 ft: 13722 corp: 6/290b lim: 90 exec/s: 0 rss: 72Mb L: 56/65 MS: 1 ChangeBinInt- 00:08:43.834 [2024-10-27 21:33:45.465720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.834 [2024-10-27 21:33:45.465748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.834 [2024-10-27 21:33:45.465788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.834 [2024-10-27 21:33:45.465803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.834 [2024-10-27 21:33:45.465860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.834 [2024-10-27 21:33:45.465875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.834 #14 NEW cov: 12495 ft: 13826 corp: 7/344b lim: 90 exec/s: 0 rss: 72Mb L: 54/65 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:43.834 [2024-10-27 21:33:45.505962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.834 [2024-10-27 21:33:45.505991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.834 [2024-10-27 21:33:45.506040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.834 [2024-10-27 21:33:45.506066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.834 [2024-10-27 21:33:45.506121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.834 [2024-10-27 21:33:45.506151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.834 [2024-10-27 21:33:45.506207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:43.834 [2024-10-27 21:33:45.506222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.834 #15 NEW cov: 12495 ft: 14244 corp: 8/430b lim: 90 exec/s: 0 rss: 72Mb L: 86/86 MS: 1 CrossOver- 00:08:44.093 [2024-10-27 21:33:45.565814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.093 [2024-10-27 21:33:45.565845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.565896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.093 [2024-10-27 21:33:45.565912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.565971] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.093 [2024-10-27 21:33:45.565988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.093 #16 NEW cov: 12495 ft: 14312 corp: 9/485b lim: 90 exec/s: 0 rss: 72Mb L: 55/86 MS: 1 InsertByte- 00:08:44.093 [2024-10-27 21:33:45.625946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.093 [2024-10-27 21:33:45.625975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.626029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.093 [2024-10-27 21:33:45.626044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.626099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.093 [2024-10-27 21:33:45.626113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.626168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.093 [2024-10-27 21:33:45.626183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.093 #17 NEW cov: 12495 ft: 14333 corp: 10/573b lim: 90 exec/s: 0 rss: 72Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:08:44.093 [2024-10-27 21:33:45.665990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.093 [2024-10-27 21:33:45.666017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.666069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.093 [2024-10-27 21:33:45.666084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.666137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.093 [2024-10-27 21:33:45.666152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.666207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.093 [2024-10-27 21:33:45.666223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.093 #18 NEW cov: 12495 ft: 14362 corp: 11/659b lim: 90 exec/s: 0 rss: 72Mb L: 86/88 MS: 1 ChangeBit- 00:08:44.093 [2024-10-27 21:33:45.726006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.093 [2024-10-27 21:33:45.726033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.726086] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.093 [2024-10-27 21:33:45.726101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.726157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.093 [2024-10-27 21:33:45.726172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.726227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.093 [2024-10-27 21:33:45.726242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.093 #19 NEW cov: 12495 ft: 14377 corp: 12/745b lim: 90 exec/s: 0 rss: 73Mb L: 86/88 MS: 1 CrossOver- 00:08:44.093 [2024-10-27 21:33:45.786041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.093 [2024-10-27 21:33:45.786069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.786119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.093 [2024-10-27 21:33:45.786134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.786190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.093 [2024-10-27 21:33:45.786206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.093 [2024-10-27 21:33:45.786262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.093 [2024-10-27 21:33:45.786278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.093 #20 NEW cov: 12495 ft: 14390 corp: 13/817b lim: 90 exec/s: 0 rss: 73Mb L: 72/88 MS: 1 CopyPart- 00:08:44.353 [2024-10-27 21:33:45.825906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.353 [2024-10-27 21:33:45.825934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:45.826004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.353 [2024-10-27 21:33:45.826020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:45.826077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.353 [2024-10-27 21:33:45.826092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.353 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:44.353 #21 NEW cov: 12518 ft: 
14465 corp: 14/873b lim: 90 exec/s: 0 rss: 73Mb L: 56/88 MS: 1 ChangeBinInt- 00:08:44.353 [2024-10-27 21:33:45.866126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.353 [2024-10-27 21:33:45.866156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:45.866219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.353 [2024-10-27 21:33:45.866235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:45.866288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.353 [2024-10-27 21:33:45.866303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:45.866357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.353 [2024-10-27 21:33:45.866377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.353 #22 NEW cov: 12518 ft: 14495 corp: 15/959b lim: 90 exec/s: 0 rss: 73Mb L: 86/88 MS: 1 ChangeBit- 00:08:44.353 [2024-10-27 21:33:45.926140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.353 [2024-10-27 21:33:45.926168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:45.926219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.353 [2024-10-27 21:33:45.926235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:45.926288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.353 [2024-10-27 21:33:45.926304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:45.926360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.353 [2024-10-27 21:33:45.926375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.353 #23 NEW cov: 12518 ft: 14507 corp: 16/1046b lim: 90 exec/s: 23 rss: 73Mb L: 87/88 MS: 1 CopyPart- 00:08:44.353 [2024-10-27 21:33:45.986026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.353 [2024-10-27 21:33:45.986053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:45.986092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.353 [2024-10-27 21:33:45.986107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:45.986165] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.353 [2024-10-27 21:33:45.986181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.353 #24 NEW cov: 12518 ft: 14529 corp: 17/1102b lim: 90 exec/s: 24 rss: 73Mb L: 56/88 MS: 1 CrossOver- 00:08:44.353 [2024-10-27 21:33:46.026151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.353 [2024-10-27 21:33:46.026178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:46.026227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.353 [2024-10-27 21:33:46.026242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:46.026296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.353 [2024-10-27 21:33:46.026312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.353 [2024-10-27 21:33:46.026367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.353 [2024-10-27 21:33:46.026381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.353 #25 NEW cov: 12518 ft: 14542 corp: 18/1175b lim: 90 exec/s: 25 rss: 73Mb L: 73/88 MS: 1 InsertByte- 00:08:44.613 [2024-10-27 21:33:46.086373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.613 [2024-10-27 21:33:46.086401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.086457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.613 [2024-10-27 21:33:46.086474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.086528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.613 [2024-10-27 21:33:46.086544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.086598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.613 [2024-10-27 21:33:46.086613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.086671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:44.613 [2024-10-27 21:33:46.086686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:44.613 #26 NEW cov: 12518 ft: 14610 corp: 19/1265b lim: 90 exec/s: 26 rss: 73Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:44.613 [2024-10-27 
21:33:46.126270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.613 [2024-10-27 21:33:46.126299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.126346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.613 [2024-10-27 21:33:46.126362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.126418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.613 [2024-10-27 21:33:46.126433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.126490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.613 [2024-10-27 21:33:46.126506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.613 #27 NEW cov: 12518 ft: 14637 corp: 20/1343b lim: 90 exec/s: 27 rss: 73Mb L: 78/90 MS: 1 CopyPart- 00:08:44.613 [2024-10-27 21:33:46.166102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.613 [2024-10-27 21:33:46.166130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.166178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.613 [2024-10-27 21:33:46.166193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.166249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.613 [2024-10-27 21:33:46.166263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.613 #28 NEW cov: 12518 ft: 14676 corp: 21/1399b lim: 90 exec/s: 28 rss: 73Mb L: 56/90 MS: 1 ChangeByte- 00:08:44.613 [2024-10-27 21:33:46.226102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.613 [2024-10-27 21:33:46.226129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.226167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.613 [2024-10-27 21:33:46.226182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.226242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.613 [2024-10-27 21:33:46.226259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.613 #29 NEW cov: 12518 ft: 14715 corp: 22/1462b lim: 90 exec/s: 29 rss: 73Mb L: 63/90 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\177"- 
00:08:44.613 [2024-10-27 21:33:46.286288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.613 [2024-10-27 21:33:46.286316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.286363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.613 [2024-10-27 21:33:46.286379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.286432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.613 [2024-10-27 21:33:46.286446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.613 [2024-10-27 21:33:46.286500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.613 [2024-10-27 21:33:46.286514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.613 #30 NEW cov: 12518 ft: 14739 corp: 23/1540b lim: 90 exec/s: 30 rss: 73Mb L: 78/90 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\177"- 00:08:44.872 [2024-10-27 21:33:46.346137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.872 [2024-10-27 21:33:46.346164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.872 [2024-10-27 21:33:46.346211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.872 [2024-10-27 21:33:46.346227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.872 [2024-10-27 21:33:46.346281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.872 [2024-10-27 21:33:46.346311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.872 #31 NEW cov: 12518 ft: 14752 corp: 24/1596b lim: 90 exec/s: 31 rss: 73Mb L: 56/90 MS: 1 ShuffleBytes- 00:08:44.872 [2024-10-27 21:33:46.386363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.872 [2024-10-27 21:33:46.386390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.872 [2024-10-27 21:33:46.386439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.872 [2024-10-27 21:33:46.386455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.872 [2024-10-27 21:33:46.386511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.872 [2024-10-27 21:33:46.386526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.872 [2024-10-27 21:33:46.386581] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.872 [2024-10-27 21:33:46.386595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.872 #32 NEW cov: 12518 ft: 14767 corp: 25/1674b lim: 90 exec/s: 32 rss: 73Mb L: 78/90 MS: 1 InsertRepeatedBytes- 00:08:44.872 [2024-10-27 21:33:46.426226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.873 [2024-10-27 21:33:46.426253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.426297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.873 [2024-10-27 21:33:46.426312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.426367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.873 [2024-10-27 21:33:46.426382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.873 #33 NEW cov: 12518 ft: 14798 corp: 26/1737b lim: 90 exec/s: 33 rss: 73Mb L: 63/90 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\177"- 00:08:44.873 [2024-10-27 21:33:46.466353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.873 [2024-10-27 21:33:46.466380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.466446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.873 [2024-10-27 21:33:46.466462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.466517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.873 [2024-10-27 21:33:46.466532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.466588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.873 [2024-10-27 21:33:46.466604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.873 #34 NEW cov: 12518 ft: 14799 corp: 27/1809b lim: 90 exec/s: 34 rss: 73Mb L: 72/90 MS: 1 CopyPart- 00:08:44.873 [2024-10-27 21:33:46.506413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.873 [2024-10-27 21:33:46.506442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.506489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.873 [2024-10-27 21:33:46.506505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.873 
[2024-10-27 21:33:46.506558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.873 [2024-10-27 21:33:46.506573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.506631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.873 [2024-10-27 21:33:46.506646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.873 #35 NEW cov: 12518 ft: 14803 corp: 28/1888b lim: 90 exec/s: 35 rss: 73Mb L: 79/90 MS: 1 InsertByte- 00:08:44.873 [2024-10-27 21:33:46.546250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.873 [2024-10-27 21:33:46.546277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.546323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.873 [2024-10-27 21:33:46.546339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.546398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.873 [2024-10-27 21:33:46.546414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.873 #36 NEW cov: 12518 ft: 14853 corp: 29/1944b lim: 90 exec/s: 36 rss: 73Mb L: 56/90 MS: 1 ChangeByte- 00:08:44.873 [2024-10-27 21:33:46.586239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.873 [2024-10-27 21:33:46.586267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.586316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.873 [2024-10-27 21:33:46.586333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.873 [2024-10-27 21:33:46.586390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.873 [2024-10-27 21:33:46.586405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.132 #37 NEW cov: 12518 ft: 14923 corp: 30/2007b lim: 90 exec/s: 37 rss: 73Mb L: 63/90 MS: 1 ChangeBit- 00:08:45.132 [2024-10-27 21:33:46.646471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.132 [2024-10-27 21:33:46.646499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.646551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.132 [2024-10-27 21:33:46.646566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.132 
[2024-10-27 21:33:46.646620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:45.132 [2024-10-27 21:33:46.646651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.646707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:45.132 [2024-10-27 21:33:46.646723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.132 #38 NEW cov: 12518 ft: 14935 corp: 31/2093b lim: 90 exec/s: 38 rss: 74Mb L: 86/90 MS: 1 ShuffleBytes- 00:08:45.132 [2024-10-27 21:33:46.706502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.132 [2024-10-27 21:33:46.706531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.706577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.132 [2024-10-27 21:33:46.706592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.706646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:45.132 [2024-10-27 21:33:46.706660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.706713] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:45.132 [2024-10-27 21:33:46.706728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.132 #39 NEW cov: 12518 ft: 14996 corp: 32/2181b lim: 90 exec/s: 39 rss: 74Mb L: 88/90 MS: 1 CopyPart- 00:08:45.132 [2024-10-27 21:33:46.766496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.132 [2024-10-27 21:33:46.766526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.766566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.132 [2024-10-27 21:33:46.766581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.766635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:45.132 [2024-10-27 21:33:46.766651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.766705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:45.132 [2024-10-27 21:33:46.766719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.132 #40 NEW cov: 12518 ft: 15004 corp: 33/2267b lim: 90 exec/s: 40 rss: 74Mb L: 86/90 MS: 1 ChangeBinInt- 00:08:45.132 
[2024-10-27 21:33:46.806227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.132 [2024-10-27 21:33:46.806254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.806295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.132 [2024-10-27 21:33:46.806311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.132 #41 NEW cov: 12518 ft: 15322 corp: 34/2306b lim: 90 exec/s: 41 rss: 74Mb L: 39/90 MS: 1 EraseBytes- 00:08:45.132 [2024-10-27 21:33:46.846715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.132 [2024-10-27 21:33:46.846743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.846799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.132 [2024-10-27 21:33:46.846814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.846868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:45.132 [2024-10-27 21:33:46.846883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.846934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:45.132 [2024-10-27 21:33:46.846953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.132 [2024-10-27 21:33:46.847010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:45.132 [2024-10-27 21:33:46.847025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.392 #42 NEW cov: 12518 ft: 15339 corp: 35/2396b lim: 90 exec/s: 42 rss: 74Mb L: 90/90 MS: 1 CopyPart- 00:08:45.392 [2024-10-27 21:33:46.906536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:45.392 [2024-10-27 21:33:46.906564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.392 [2024-10-27 21:33:46.906616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:45.392 [2024-10-27 21:33:46.906631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.392 [2024-10-27 21:33:46.906703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:45.392 [2024-10-27 21:33:46.906718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.392 [2024-10-27 21:33:46.906774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 
cid:3 nsid:0 00:08:45.392 [2024-10-27 21:33:46.906789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.392 #43 NEW cov: 12518 ft: 15372 corp: 36/2469b lim: 90 exec/s: 21 rss: 74Mb L: 73/90 MS: 1 CopyPart- 00:08:45.392 #43 DONE cov: 12518 ft: 15372 corp: 36/2469b lim: 90 exec/s: 21 rss: 74Mb 00:08:45.392 ###### Recommended dictionary. ###### 00:08:45.392 "\377\377\377\377\377\377\377\177" # Uses: 2 00:08:45.392 ###### End of recommended dictionary. ###### 00:08:45.392 Done 43 runs in 2 second(s) 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:45.392 21:33:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:45.392 [2024-10-27 21:33:47.074619] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 
00:08:45.392 [2024-10-27 21:33:47.074694] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3331153 ] 00:08:45.961 [2024-10-27 21:33:47.393663] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:45.961 [2024-10-27 21:33:47.440052] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.961 [2024-10-27 21:33:47.457285] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.961 [2024-10-27 21:33:47.509562] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:45.961 [2024-10-27 21:33:47.525868] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:45.961 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.961 INFO: Seed: 729246346 00:08:45.961 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:45.961 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:45.961 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:45.961 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.961 #2 INITED exec/s: 0 rss: 64Mb 00:08:45.961 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:45.961 This may also happen if the target rejected all inputs we tried so far 00:08:45.961 [2024-10-27 21:33:47.571012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.961 [2024-10-27 21:33:47.571042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.219 NEW_FUNC[1/717]: 0x4842b8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:46.219 NEW_FUNC[2/717]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:46.219 #5 NEW cov: 12265 ft: 12263 corp: 2/13b lim: 50 exec/s: 0 rss: 72Mb L: 12/12 MS: 3 InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:46.219 [2024-10-27 21:33:47.881107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.219 [2024-10-27 21:33:47.881139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.219 #11 NEW cov: 12379 ft: 12892 corp: 3/31b lim: 50 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 CopyPart- 00:08:46.219 [2024-10-27 21:33:47.941417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.219 [2024-10-27 21:33:47.941446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.219 [2024-10-27 21:33:47.941485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.219 [2024-10-27 21:33:47.941501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.219 [2024-10-27 
21:33:47.941560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.220 [2024-10-27 21:33:47.941574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.478 #12 NEW cov: 12385 ft: 13973 corp: 4/61b lim: 50 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:46.478 [2024-10-27 21:33:47.981105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.478 [2024-10-27 21:33:47.981132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.478 #15 NEW cov: 12470 ft: 14264 corp: 5/73b lim: 50 exec/s: 0 rss: 72Mb L: 12/30 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:46.478 [2024-10-27 21:33:48.021271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.478 [2024-10-27 21:33:48.021299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.478 [2024-10-27 21:33:48.021365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.478 [2024-10-27 21:33:48.021382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.478 #24 NEW cov: 12470 ft: 14597 corp: 6/97b lim: 50 exec/s: 0 rss: 72Mb L: 24/30 MS: 4 CopyPart-ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:46.478 [2024-10-27 21:33:48.061152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.478 [2024-10-27 21:33:48.061183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.478 #25 NEW cov: 12470 ft: 14663 corp: 7/111b lim: 50 exec/s: 0 rss: 72Mb L: 14/30 MS: 1 EraseBytes- 00:08:46.478 [2024-10-27 21:33:48.121119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.478 [2024-10-27 21:33:48.121147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.478 #26 NEW cov: 12470 ft: 14743 corp: 8/123b lim: 50 exec/s: 0 rss: 72Mb L: 12/30 MS: 1 ChangeBinInt- 00:08:46.478 [2024-10-27 21:33:48.161608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.478 [2024-10-27 21:33:48.161636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.478 [2024-10-27 21:33:48.161705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.478 [2024-10-27 21:33:48.161721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.478 [2024-10-27 21:33:48.161779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.478 [2024-10-27 21:33:48.161794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.478 [2024-10-27 21:33:48.161853] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.478 [2024-10-27 21:33:48.161869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.478 #32 NEW cov: 12470 ft: 15133 corp: 9/165b lim: 50 exec/s: 0 rss: 72Mb L: 42/42 MS: 1 InsertRepeatedBytes- 00:08:46.478 [2024-10-27 21:33:48.201173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.478 [2024-10-27 21:33:48.201217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.737 #38 NEW cov: 12470 ft: 15158 corp: 10/183b lim: 50 exec/s: 0 rss: 72Mb L: 18/42 MS: 1 EraseBytes- 00:08:46.737 [2024-10-27 21:33:48.261193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.737 [2024-10-27 21:33:48.261221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.737 #39 NEW cov: 12470 ft: 15214 corp: 11/194b lim: 50 exec/s: 0 rss: 72Mb L: 11/42 MS: 1 EraseBytes- 00:08:46.737 [2024-10-27 21:33:48.321496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.737 [2024-10-27 21:33:48.321524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.737 [2024-10-27 21:33:48.321578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.737 [2024-10-27 21:33:48.321594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.737 [2024-10-27 21:33:48.321652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.737 [2024-10-27 21:33:48.321668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.737 #40 NEW cov: 12470 ft: 15230 corp: 12/232b lim: 50 exec/s: 0 rss: 72Mb L: 38/42 MS: 1 InsertRepeatedBytes- 00:08:46.737 [2024-10-27 21:33:48.381520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.737 [2024-10-27 21:33:48.381548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.737 [2024-10-27 21:33:48.381586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.737 [2024-10-27 21:33:48.381605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.737 [2024-10-27 21:33:48.381659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.737 [2024-10-27 21:33:48.381673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.737 #41 NEW cov: 12470 ft: 15247 corp: 13/262b lim: 50 exec/s: 0 rss: 72Mb L: 30/42 MS: 1 ChangeBit- 00:08:46.737 [2024-10-27 21:33:48.441233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 
00:08:46.737 [2024-10-27 21:33:48.441261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.996 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:46.996 #42 NEW cov: 12493 ft: 15340 corp: 14/276b lim: 50 exec/s: 0 rss: 73Mb L: 14/42 MS: 1 ShuffleBytes- 00:08:46.996 [2024-10-27 21:33:48.501583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.996 [2024-10-27 21:33:48.501612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.996 [2024-10-27 21:33:48.501652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.996 [2024-10-27 21:33:48.501667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.996 [2024-10-27 21:33:48.501725] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.996 [2024-10-27 21:33:48.501741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.996 #43 NEW cov: 12493 ft: 15354 corp: 15/306b lim: 50 exec/s: 0 rss: 73Mb L: 30/42 MS: 1 ChangeBit- 00:08:46.996 [2024-10-27 21:33:48.561641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.996 [2024-10-27 21:33:48.561669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.996 [2024-10-27 21:33:48.561709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.996 [2024-10-27 21:33:48.561725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.996 [2024-10-27 21:33:48.561786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.996 [2024-10-27 21:33:48.561801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.996 #44 NEW cov: 12493 ft: 15373 corp: 16/338b lim: 50 exec/s: 44 rss: 73Mb L: 32/42 MS: 1 CMP- DE: "\000\000"- 00:08:46.996 [2024-10-27 21:33:48.621818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.996 [2024-10-27 21:33:48.621846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.996 [2024-10-27 21:33:48.621915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.996 [2024-10-27 21:33:48.621931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.996 [2024-10-27 21:33:48.621991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.996 [2024-10-27 21:33:48.622006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.996 
[2024-10-27 21:33:48.622063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.996 [2024-10-27 21:33:48.622082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.996 #45 NEW cov: 12493 ft: 15376 corp: 17/380b lim: 50 exec/s: 45 rss: 73Mb L: 42/42 MS: 1 ChangeBinInt- 00:08:46.996 [2024-10-27 21:33:48.681351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.996 [2024-10-27 21:33:48.681379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.996 #46 NEW cov: 12493 ft: 15438 corp: 18/398b lim: 50 exec/s: 46 rss: 73Mb L: 18/42 MS: 1 ChangeBit- 00:08:47.255 [2024-10-27 21:33:48.721919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.255 [2024-10-27 21:33:48.721952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.255 [2024-10-27 21:33:48.722011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.255 [2024-10-27 21:33:48.722027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.255 [2024-10-27 21:33:48.722084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.255 [2024-10-27 21:33:48.722099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.255 [2024-10-27 21:33:48.722159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.255 [2024-10-27 21:33:48.722175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.255 #47 NEW cov: 12493 ft: 15472 corp: 19/440b lim: 50 exec/s: 47 rss: 73Mb L: 42/42 MS: 1 CopyPart- 00:08:47.255 [2024-10-27 21:33:48.781387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.255 [2024-10-27 21:33:48.781415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.255 #48 NEW cov: 12493 ft: 15516 corp: 20/451b lim: 50 exec/s: 48 rss: 73Mb L: 11/42 MS: 1 EraseBytes- 00:08:47.255 [2024-10-27 21:33:48.841782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.255 [2024-10-27 21:33:48.841810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.255 [2024-10-27 21:33:48.841860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.255 [2024-10-27 21:33:48.841876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.255 [2024-10-27 21:33:48.841935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.255 [2024-10-27 21:33:48.841955] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.255 #49 NEW cov: 12493 ft: 15533 corp: 21/483b lim: 50 exec/s: 49 rss: 73Mb L: 32/42 MS: 1 InsertRepeatedBytes- 00:08:47.255 [2024-10-27 21:33:48.881442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.255 [2024-10-27 21:33:48.881470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.255 #53 NEW cov: 12493 ft: 15544 corp: 22/493b lim: 50 exec/s: 53 rss: 73Mb L: 10/42 MS: 4 ChangeBit-ChangeBinInt-ShuffleBytes-InsertRepeatedBytes- 00:08:47.255 [2024-10-27 21:33:48.921526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.255 [2024-10-27 21:33:48.921554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.255 #58 NEW cov: 12493 ft: 15616 corp: 23/508b lim: 50 exec/s: 58 rss: 73Mb L: 15/42 MS: 5 ChangeBinInt-InsertByte-ChangeBit-CrossOver-InsertRepeatedBytes- 00:08:47.255 [2024-10-27 21:33:48.961653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.255 [2024-10-27 21:33:48.961681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.255 [2024-10-27 21:33:48.961718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.255 [2024-10-27 21:33:48.961734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.514 #59 NEW cov: 12493 ft: 15634 corp: 24/536b lim: 50 exec/s: 59 rss: 73Mb L: 28/42 MS: 1 CopyPart- 00:08:47.514 [2024-10-27 21:33:49.002025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.514 [2024-10-27 21:33:49.002052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.002107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.514 [2024-10-27 21:33:49.002123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.002194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.514 [2024-10-27 21:33:49.002211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.002271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.514 [2024-10-27 21:33:49.002286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.042014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.514 [2024-10-27 21:33:49.042042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.514 
[2024-10-27 21:33:49.042097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.514 [2024-10-27 21:33:49.042113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.042171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.514 [2024-10-27 21:33:49.042186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.042245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.514 [2024-10-27 21:33:49.042261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.514 #61 NEW cov: 12493 ft: 15641 corp: 25/585b lim: 50 exec/s: 61 rss: 73Mb L: 49/49 MS: 2 InsertRepeatedBytes-PersAutoDict- DE: "\000\000"- 00:08:47.514 [2024-10-27 21:33:49.081565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.514 [2024-10-27 21:33:49.081594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.514 #62 NEW cov: 12493 ft: 15661 corp: 26/601b lim: 50 exec/s: 62 rss: 73Mb L: 16/49 MS: 1 EraseBytes- 00:08:47.514 [2024-10-27 21:33:49.121959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.514 [2024-10-27 21:33:49.121986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.122030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.514 [2024-10-27 21:33:49.122046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.122104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.514 [2024-10-27 21:33:49.122119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.514 #63 NEW cov: 12493 ft: 15700 corp: 27/634b lim: 50 exec/s: 63 rss: 73Mb L: 33/49 MS: 1 InsertByte- 00:08:47.514 [2024-10-27 21:33:49.182116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.514 [2024-10-27 21:33:49.182143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.182196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.514 [2024-10-27 21:33:49.182212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.182268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.514 [2024-10-27 21:33:49.182283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:47.514 [2024-10-27 21:33:49.182340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.514 [2024-10-27 21:33:49.182356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.514 #64 NEW cov: 12493 ft: 15739 corp: 28/680b lim: 50 exec/s: 64 rss: 73Mb L: 46/49 MS: 1 InsertRepeatedBytes- 00:08:47.774 [2024-10-27 21:33:49.242252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.774 [2024-10-27 21:33:49.242280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.774 [2024-10-27 21:33:49.242332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.774 [2024-10-27 21:33:49.242347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.774 [2024-10-27 21:33:49.242405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.774 [2024-10-27 21:33:49.242419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.774 [2024-10-27 21:33:49.242477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.774 [2024-10-27 21:33:49.242493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.774 #65 NEW cov: 12493 ft: 15772 corp: 29/722b lim: 50 exec/s: 65 rss: 73Mb L: 42/49 MS: 1 ChangeByte- 00:08:47.774 [2024-10-27 21:33:49.281681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.774 [2024-10-27 21:33:49.281709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.774 #66 NEW cov: 12493 ft: 15806 corp: 30/733b lim: 50 exec/s: 66 rss: 74Mb L: 11/49 MS: 1 ChangeByte- 00:08:47.774 [2024-10-27 21:33:49.342113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.774 [2024-10-27 21:33:49.342140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.774 [2024-10-27 21:33:49.342180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.774 [2024-10-27 21:33:49.342199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.774 [2024-10-27 21:33:49.342255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.774 [2024-10-27 21:33:49.342270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.774 #67 NEW cov: 12493 ft: 15817 corp: 31/772b lim: 50 exec/s: 67 rss: 74Mb L: 39/49 MS: 1 CrossOver- 00:08:47.774 [2024-10-27 21:33:49.401932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.774 [2024-10-27 21:33:49.401963] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.774 [2024-10-27 21:33:49.402000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.774 [2024-10-27 21:33:49.402016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.774 #68 NEW cov: 12493 ft: 15827 corp: 32/796b lim: 50 exec/s: 68 rss: 74Mb L: 24/49 MS: 1 ChangeBit- 00:08:47.774 [2024-10-27 21:33:49.441791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.774 [2024-10-27 21:33:49.441818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.774 #69 NEW cov: 12493 ft: 15838 corp: 33/810b lim: 50 exec/s: 69 rss: 74Mb L: 14/49 MS: 1 ChangeByte- 00:08:47.774 [2024-10-27 21:33:49.482161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.774 [2024-10-27 21:33:49.482188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.774 [2024-10-27 21:33:49.482224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.774 [2024-10-27 21:33:49.482240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.774 [2024-10-27 21:33:49.482297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.774 [2024-10-27 21:33:49.482314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.040 #71 NEW cov: 12493 ft: 15922 corp: 34/840b lim: 50 exec/s: 71 rss: 74Mb L: 30/49 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:48.040 [2024-10-27 21:33:49.522119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.040 [2024-10-27 21:33:49.522146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.040 [2024-10-27 21:33:49.522189] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.040 [2024-10-27 21:33:49.522205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.040 [2024-10-27 21:33:49.522262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.040 [2024-10-27 21:33:49.522278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.040 #72 NEW cov: 12493 ft: 15940 corp: 35/872b lim: 50 exec/s: 72 rss: 74Mb L: 32/49 MS: 1 ChangeBit- 00:08:48.040 [2024-10-27 21:33:49.562342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:48.040 [2024-10-27 21:33:49.562370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.040 [2024-10-27 21:33:49.562423] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:48.040 [2024-10-27 21:33:49.562439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.040 [2024-10-27 21:33:49.562512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:48.040 [2024-10-27 21:33:49.562527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.040 [2024-10-27 21:33:49.562585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:48.040 [2024-10-27 21:33:49.562600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.040 #73 NEW cov: 12493 ft: 15997 corp: 36/912b lim: 50 exec/s: 36 rss: 74Mb L: 40/49 MS: 1 InsertRepeatedBytes- 00:08:48.040 #73 DONE cov: 12493 ft: 15997 corp: 36/912b lim: 50 exec/s: 36 rss: 74Mb 00:08:48.040 ###### Recommended dictionary. ###### 00:08:48.040 "\000\000" # Uses: 1 00:08:48.040 ###### End of recommended dictionary. ###### 00:08:48.040 Done 73 runs in 2 second(s) 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:48.040 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:48.041 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:48.041 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:48.041 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:48.041 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:48.041 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:48.041 21:33:49 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:48.041 [2024-10-27 21:33:49.732265] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:48.041 [2024-10-27 21:33:49.732341] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3331501 ] 00:08:48.610 [2024-10-27 21:33:50.051750] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:48.610 [2024-10-27 21:33:50.099593] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.610 [2024-10-27 21:33:50.115104] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.610 [2024-10-27 21:33:50.168004] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:48.610 [2024-10-27 21:33:50.184339] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:48.610 INFO: Running with entropic power schedule (0xFF, 100). 00:08:48.610 INFO: Seed: 3387273673 00:08:48.610 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:48.610 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:48.610 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:48.610 INFO: A corpus is not provided, starting from an empty corpus 00:08:48.610 #2 INITED exec/s: 0 rss: 65Mb 00:08:48.610 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:48.610 This may also happen if the target rejected all inputs we tried so far 00:08:48.610 [2024-10-27 21:33:50.239010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.611 [2024-10-27 21:33:50.239046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.869 NEW_FUNC[1/717]: 0x486588 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:48.869 NEW_FUNC[2/717]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:48.869 #44 NEW cov: 12267 ft: 12288 corp: 2/34b lim: 85 exec/s: 0 rss: 72Mb L: 33/33 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:48.869 [2024-10-27 21:33:50.589107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.869 [2024-10-27 21:33:50.589147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.869 [2024-10-27 21:33:50.589183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.869 [2024-10-27 21:33:50.589201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.129 #45 NEW cov: 12405 ft: 13710 corp: 3/69b lim: 85 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 CMP- DE: "\030\000"- 00:08:49.129 [2024-10-27 21:33:50.689090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.129 [2024-10-27 21:33:50.689136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.129 [2024-10-27 21:33:50.689170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.129 [2024-10-27 21:33:50.689189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.129 [2024-10-27 21:33:50.689220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.129 [2024-10-27 21:33:50.689237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.129 #46 NEW cov: 12411 ft: 14311 corp: 4/123b lim: 85 exec/s: 0 rss: 72Mb L: 54/54 MS: 1 CopyPart- 00:08:49.129 [2024-10-27 21:33:50.788919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.129 [2024-10-27 21:33:50.788954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.129 #47 NEW cov: 12496 ft: 14691 corp: 5/149b lim: 85 exec/s: 0 rss: 73Mb L: 26/54 MS: 1 CrossOver- 00:08:49.129 [2024-10-27 21:33:50.849027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.129 [2024-10-27 21:33:50.849057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.129 [2024-10-27 21:33:50.849105] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.129 [2024-10-27 21:33:50.849127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.129 [2024-10-27 21:33:50.849158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.129 [2024-10-27 21:33:50.849174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.388 #48 NEW cov: 12496 ft: 14805 corp: 6/203b lim: 85 exec/s: 0 rss: 73Mb L: 54/54 MS: 1 CrossOver- 00:08:49.388 [2024-10-27 21:33:50.939004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.388 [2024-10-27 21:33:50.939034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.388 [2024-10-27 21:33:50.939084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.388 [2024-10-27 21:33:50.939117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.388 #49 NEW cov: 12496 ft: 14859 corp: 7/244b lim: 85 exec/s: 0 rss: 73Mb L: 41/54 MS: 1 CrossOver- 00:08:49.388 [2024-10-27 21:33:51.029201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.388 [2024-10-27 21:33:51.029232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.388 [2024-10-27 21:33:51.029283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.388 [2024-10-27 21:33:51.029301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.388 [2024-10-27 21:33:51.029332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.388 [2024-10-27 21:33:51.029348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.388 [2024-10-27 21:33:51.029378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.388 [2024-10-27 21:33:51.029395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.388 #50 NEW cov: 12496 ft: 15278 corp: 8/312b lim: 85 exec/s: 0 rss: 73Mb L: 68/68 MS: 1 InsertRepeatedBytes- 00:08:49.388 [2024-10-27 21:33:51.089090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.388 [2024-10-27 21:33:51.089121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.388 [2024-10-27 21:33:51.089155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.388 [2024-10-27 21:33:51.089172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.646 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:49.646 #51 NEW cov: 12519 ft: 15346 corp: 9/353b lim: 85 exec/s: 0 rss: 73Mb L: 41/68 MS: 1 ChangeBinInt- 00:08:49.647 [2024-10-27 21:33:51.189012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.647 [2024-10-27 21:33:51.189042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.647 #52 NEW cov: 12519 ft: 15382 corp: 10/386b lim: 85 exec/s: 52 rss: 73Mb L: 33/68 MS: 1 ChangeBit- 00:08:49.647 [2024-10-27 21:33:51.249024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.647 [2024-10-27 21:33:51.249055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.647 #54 NEW cov: 12519 ft: 15423 corp: 11/409b lim: 85 exec/s: 54 rss: 73Mb L: 23/68 MS: 2 CopyPart-CrossOver- 00:08:49.647 [2024-10-27 21:33:51.299216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.647 [2024-10-27 21:33:51.299248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.647 [2024-10-27 21:33:51.299297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.647 [2024-10-27 21:33:51.299314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.647 [2024-10-27 21:33:51.299344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.647 [2024-10-27 21:33:51.299360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.647 [2024-10-27 21:33:51.299389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.647 [2024-10-27 21:33:51.299404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.906 #55 NEW cov: 12519 ft: 15463 corp: 12/488b lim: 85 exec/s: 55 rss: 73Mb L: 79/79 MS: 1 CopyPart- 00:08:49.906 [2024-10-27 21:33:51.389060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.906 [2024-10-27 21:33:51.389091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.906 #56 NEW cov: 12519 ft: 15474 corp: 13/512b lim: 85 exec/s: 56 rss: 73Mb L: 24/79 MS: 1 InsertByte- 00:08:49.906 [2024-10-27 21:33:51.479144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.906 [2024-10-27 21:33:51.479174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.906 [2024-10-27 21:33:51.479222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.906 [2024-10-27 21:33:51.479239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.906 
#57 NEW cov: 12519 ft: 15491 corp: 14/550b lim: 85 exec/s: 57 rss: 73Mb L: 38/79 MS: 1 InsertRepeatedBytes- 00:08:49.906 [2024-10-27 21:33:51.539175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.906 [2024-10-27 21:33:51.539204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.906 [2024-10-27 21:33:51.539253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.906 [2024-10-27 21:33:51.539271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.906 #58 NEW cov: 12519 ft: 15516 corp: 15/592b lim: 85 exec/s: 58 rss: 73Mb L: 42/79 MS: 1 InsertByte- 00:08:49.906 [2024-10-27 21:33:51.599113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.906 [2024-10-27 21:33:51.599143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.164 #62 NEW cov: 12519 ft: 15535 corp: 16/611b lim: 85 exec/s: 62 rss: 73Mb L: 19/79 MS: 4 EraseBytes-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:50.164 [2024-10-27 21:33:51.649187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.164 [2024-10-27 21:33:51.649216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.164 [2024-10-27 21:33:51.649264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:50.164 [2024-10-27 21:33:51.649282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.164 #63 NEW cov: 12519 ft: 15583 corp: 17/652b lim: 85 exec/s: 63 rss: 73Mb L: 41/79 MS: 1 ShuffleBytes- 00:08:50.164 [2024-10-27 21:33:51.709137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.164 [2024-10-27 21:33:51.709166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.164 #64 NEW cov: 12519 ft: 15622 corp: 18/683b lim: 85 exec/s: 64 rss: 73Mb L: 31/79 MS: 1 EraseBytes- 00:08:50.164 [2024-10-27 21:33:51.769330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.164 [2024-10-27 21:33:51.769360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.164 [2024-10-27 21:33:51.769408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:50.164 [2024-10-27 21:33:51.769425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.164 [2024-10-27 21:33:51.769455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:50.164 [2024-10-27 21:33:51.769471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.164 [2024-10-27 21:33:51.769500] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:50.165 [2024-10-27 21:33:51.769517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.165 #65 NEW cov: 12519 ft: 15643 corp: 19/751b lim: 85 exec/s: 65 rss: 73Mb L: 68/79 MS: 1 ShuffleBytes- 00:08:50.165 [2024-10-27 21:33:51.859170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.165 [2024-10-27 21:33:51.859200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.428 #66 NEW cov: 12519 ft: 15659 corp: 20/784b lim: 85 exec/s: 66 rss: 73Mb L: 33/79 MS: 1 ChangeByte- 00:08:50.428 [2024-10-27 21:33:51.909247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.428 [2024-10-27 21:33:51.909276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.428 [2024-10-27 21:33:51.909325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:50.428 [2024-10-27 21:33:51.909342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.428 #67 NEW cov: 12519 ft: 15675 corp: 21/827b lim: 85 exec/s: 67 rss: 73Mb L: 43/79 MS: 1 PersAutoDict- DE: "\030\000"- 00:08:50.428 [2024-10-27 21:33:51.999408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.428 [2024-10-27 21:33:51.999438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.428 [2024-10-27 21:33:51.999486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:50.428 [2024-10-27 21:33:51.999503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.428 [2024-10-27 21:33:51.999532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:50.428 [2024-10-27 21:33:51.999549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.428 [2024-10-27 21:33:51.999578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:50.429 [2024-10-27 21:33:51.999594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.429 #68 NEW cov: 12519 ft: 15709 corp: 22/896b lim: 85 exec/s: 68 rss: 73Mb L: 69/79 MS: 1 InsertByte- 00:08:50.429 [2024-10-27 21:33:52.059286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.429 [2024-10-27 21:33:52.059315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.429 [2024-10-27 21:33:52.059363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:50.429 [2024-10-27 21:33:52.059380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.429 #69 NEW cov: 12519 ft: 15723 corp: 23/939b lim: 85 exec/s: 69 rss: 74Mb L: 43/79 MS: 1 ChangeBit- 00:08:50.429 [2024-10-27 21:33:52.149298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:50.429 [2024-10-27 21:33:52.149329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.692 #70 NEW cov: 12519 ft: 15740 corp: 24/972b lim: 85 exec/s: 35 rss: 74Mb L: 33/79 MS: 1 CopyPart- 00:08:50.692 #70 DONE cov: 12519 ft: 15740 corp: 24/972b lim: 85 exec/s: 35 rss: 74Mb 00:08:50.692 ###### Recommended dictionary. ###### 00:08:50.692 "\030\000" # Uses: 1 00:08:50.692 ###### End of recommended dictionary. ###### 00:08:50.692 Done 70 runs in 2 second(s) 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:50.692 21:33:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:50.692 [2024-10-27 21:33:52.371116] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 
24.11.0-rc1 initialization... 00:08:50.692 [2024-10-27 21:33:52.371210] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3332041 ] 00:08:51.260 [2024-10-27 21:33:52.683196] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:51.260 [2024-10-27 21:33:52.728634] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:51.260 [2024-10-27 21:33:52.746214] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:51.260 [2024-10-27 21:33:52.798416] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:51.260 [2024-10-27 21:33:52.814721] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:51.260 INFO: Running with entropic power schedule (0xFF, 100). 00:08:51.260 INFO: Seed: 1723285964 00:08:51.260 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:51.260 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:51.260 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:51.260 INFO: A corpus is not provided, starting from an empty corpus 00:08:51.260 #2 INITED exec/s: 0 rss: 64Mb 00:08:51.260 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:51.260 This may also happen if the target rejected all inputs we tried so far 00:08:51.260 [2024-10-27 21:33:52.882196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.260 [2024-10-27 21:33:52.882238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.260 [2024-10-27 21:33:52.882315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.260 [2024-10-27 21:33:52.882330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.260 [2024-10-27 21:33:52.882405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.260 [2024-10-27 21:33:52.882426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.260 [2024-10-27 21:33:52.882502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.260 [2024-10-27 21:33:52.882522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.518 NEW_FUNC[1/716]: 0x4897c8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:51.518 NEW_FUNC[2/716]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:51.518 #4 NEW cov: 12221 ft: 12223 corp: 2/24b lim: 25 exec/s: 0 rss: 72Mb L: 23/23 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:51.518 [2024-10-27 21:33:53.231267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) 
sqid:1 cid:0 nsid:0 00:08:51.518 [2024-10-27 21:33:53.231312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.518 [2024-10-27 21:33:53.231445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.518 [2024-10-27 21:33:53.231473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.518 [2024-10-27 21:33:53.231594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.518 [2024-10-27 21:33:53.231620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.518 [2024-10-27 21:33:53.231752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.518 [2024-10-27 21:33:53.231787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.777 #5 NEW cov: 12338 ft: 12826 corp: 3/48b lim: 25 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 InsertByte- 00:08:51.777 [2024-10-27 21:33:53.301194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.777 [2024-10-27 21:33:53.301227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.777 [2024-10-27 21:33:53.301313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.777 [2024-10-27 21:33:53.301335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.777 [2024-10-27 21:33:53.301447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.777 [2024-10-27 21:33:53.301471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.777 [2024-10-27 21:33:53.301588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.777 [2024-10-27 21:33:53.301607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.777 #6 NEW cov: 12344 ft: 13010 corp: 4/72b lim: 25 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 ChangeByte- 00:08:51.777 [2024-10-27 21:33:53.371322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.777 [2024-10-27 21:33:53.371354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.777 [2024-10-27 21:33:53.371467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.777 [2024-10-27 21:33:53.371490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.777 [2024-10-27 21:33:53.371612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.777 [2024-10-27 21:33:53.371636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.777 [2024-10-27 21:33:53.371764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.777 [2024-10-27 21:33:53.371786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.777 #7 NEW cov: 12429 ft: 13381 corp: 5/95b lim: 25 exec/s: 0 rss: 72Mb L: 23/24 MS: 1 ShuffleBytes- 00:08:51.777 [2024-10-27 21:33:53.421430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.777 [2024-10-27 21:33:53.421464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.777 [2024-10-27 21:33:53.421571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.777 [2024-10-27 21:33:53.421590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.777 [2024-10-27 21:33:53.421711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.777 [2024-10-27 21:33:53.421735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.777 [2024-10-27 21:33:53.421850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.777 [2024-10-27 21:33:53.421872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.777 [2024-10-27 21:33:53.422007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.777 [2024-10-27 21:33:53.422029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.777 #8 NEW cov: 12429 ft: 13535 corp: 6/120b lim: 25 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 CopyPart- 00:08:51.777 [2024-10-27 21:33:53.470673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.778 [2024-10-27 21:33:53.470705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.778 #9 NEW cov: 12429 ft: 14286 corp: 7/125b lim: 25 exec/s: 0 rss: 72Mb L: 5/25 MS: 1 CrossOver- 00:08:52.036 [2024-10-27 21:33:53.521513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.036 [2024-10-27 21:33:53.521544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.036 [2024-10-27 21:33:53.521630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.036 [2024-10-27 21:33:53.521651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.036 [2024-10-27 21:33:53.521768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.036 [2024-10-27 21:33:53.521787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.036 [2024-10-27 21:33:53.521903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.036 [2024-10-27 21:33:53.521925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.036 [2024-10-27 21:33:53.522049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:52.036 [2024-10-27 21:33:53.522072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.036 #10 NEW cov: 12429 ft: 14374 corp: 8/150b lim: 25 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 CrossOver- 00:08:52.036 [2024-10-27 21:33:53.591146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.036 [2024-10-27 21:33:53.591177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.036 [2024-10-27 21:33:53.591276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.036 [2024-10-27 21:33:53.591297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.036 [2024-10-27 21:33:53.591425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.036 [2024-10-27 21:33:53.591449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.036 #11 NEW cov: 12429 ft: 14617 corp: 9/168b lim: 25 exec/s: 0 rss: 72Mb L: 18/25 MS: 1 InsertRepeatedBytes- 00:08:52.036 [2024-10-27 21:33:53.631420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.036 [2024-10-27 21:33:53.631453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.036 [2024-10-27 21:33:53.631553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.036 [2024-10-27 21:33:53.631574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.036 [2024-10-27 21:33:53.631692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.036 [2024-10-27 21:33:53.631715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.036 [2024-10-27 21:33:53.631835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.036 [2024-10-27 21:33:53.631857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.036 #12 NEW cov: 12429 ft: 14645 corp: 10/192b lim: 25 exec/s: 0 rss: 72Mb L: 24/25 MS: 1 CMP- DE: "\377y\357)\224\206Z\014"- 00:08:52.036 [2024-10-27 21:33:53.700803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.036 [2024-10-27 21:33:53.700834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.036 #13 NEW cov: 12429 ft: 14779 corp: 11/201b lim: 25 exec/s: 0 rss: 72Mb L: 9/25 MS: 1 CMP- DE: "\262-\304I$\357z\000"- 00:08:52.036 [2024-10-27 21:33:53.750858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.037 [2024-10-27 21:33:53.750890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.295 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:52.295 #14 NEW cov: 12452 ft: 14863 corp: 12/207b lim: 25 exec/s: 0 rss: 73Mb L: 6/25 MS: 1 InsertByte- 00:08:52.295 [2024-10-27 21:33:53.821547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.295 [2024-10-27 21:33:53.821581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.295 [2024-10-27 21:33:53.821665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.295 [2024-10-27 21:33:53.821686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.295 [2024-10-27 21:33:53.821817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.295 [2024-10-27 21:33:53.821835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.295 [2024-10-27 21:33:53.821968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.295 [2024-10-27 21:33:53.821993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.295 #15 NEW cov: 12452 ft: 14886 corp: 13/231b lim: 25 exec/s: 0 rss: 73Mb L: 24/25 MS: 1 ShuffleBytes- 00:08:52.295 [2024-10-27 21:33:53.871777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.295 [2024-10-27 21:33:53.871809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.295 [2024-10-27 21:33:53.871893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.295 [2024-10-27 21:33:53.871914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.295 [2024-10-27 21:33:53.872040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.295 [2024-10-27 21:33:53.872064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.295 [2024-10-27 21:33:53.872184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.295 [2024-10-27 21:33:53.872209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.295 [2024-10-27 21:33:53.872334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 
00:08:52.295 [2024-10-27 21:33:53.872358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.295 #16 NEW cov: 12452 ft: 14905 corp: 14/256b lim: 25 exec/s: 16 rss: 73Mb L: 25/25 MS: 1 InsertByte- 00:08:52.295 [2024-10-27 21:33:53.921783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.295 [2024-10-27 21:33:53.921818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.296 [2024-10-27 21:33:53.921918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.296 [2024-10-27 21:33:53.921945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.296 [2024-10-27 21:33:53.922065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.296 [2024-10-27 21:33:53.922088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.296 [2024-10-27 21:33:53.922212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.296 [2024-10-27 21:33:53.922232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.296 [2024-10-27 21:33:53.922359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:52.296 [2024-10-27 21:33:53.922381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.296 #17 NEW cov: 12452 ft: 14924 corp: 15/281b lim: 25 exec/s: 17 rss: 73Mb L: 25/25 MS: 1 ChangeBinInt- 00:08:52.296 [2024-10-27 21:33:53.991613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.296 [2024-10-27 21:33:53.991645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.296 [2024-10-27 21:33:53.991744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.296 [2024-10-27 21:33:53.991764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.296 [2024-10-27 21:33:53.991885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.296 [2024-10-27 21:33:53.991905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.296 [2024-10-27 21:33:53.992030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.296 [2024-10-27 21:33:53.992053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.296 #20 NEW cov: 12452 ft: 14959 corp: 16/302b lim: 25 exec/s: 20 rss: 73Mb L: 21/25 MS: 3 CrossOver-ChangeByte-CrossOver- 00:08:52.554 [2024-10-27 21:33:54.041679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 
nsid:0 00:08:52.554 [2024-10-27 21:33:54.041715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.554 [2024-10-27 21:33:54.041837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.554 [2024-10-27 21:33:54.041860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.554 [2024-10-27 21:33:54.041987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.554 [2024-10-27 21:33:54.042011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.554 [2024-10-27 21:33:54.042132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.554 [2024-10-27 21:33:54.042157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.554 #21 NEW cov: 12452 ft: 14991 corp: 17/326b lim: 25 exec/s: 21 rss: 73Mb L: 24/25 MS: 1 CMP- DE: "\357\207+q$\357z\000"- 00:08:52.554 [2024-10-27 21:33:54.111121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.554 [2024-10-27 21:33:54.111148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.554 #22 NEW cov: 12452 ft: 15015 corp: 18/332b lim: 25 exec/s: 22 rss: 73Mb L: 6/25 MS: 1 ShuffleBytes- 00:08:52.554 [2024-10-27 21:33:54.181507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.554 [2024-10-27 21:33:54.181543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.554 [2024-10-27 21:33:54.181654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.554 [2024-10-27 21:33:54.181675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.554 [2024-10-27 21:33:54.181790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.554 [2024-10-27 21:33:54.181814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.554 #23 NEW cov: 12452 ft: 15027 corp: 19/350b lim: 25 exec/s: 23 rss: 73Mb L: 18/25 MS: 1 CrossOver- 00:08:52.554 [2024-10-27 21:33:54.251750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.554 [2024-10-27 21:33:54.251780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.554 [2024-10-27 21:33:54.251885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.554 [2024-10-27 21:33:54.251906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.554 [2024-10-27 21:33:54.252036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 
cid:2 nsid:0 00:08:52.554 [2024-10-27 21:33:54.252059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.554 [2024-10-27 21:33:54.252178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.554 [2024-10-27 21:33:54.252201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.554 #24 NEW cov: 12452 ft: 15117 corp: 20/373b lim: 25 exec/s: 24 rss: 73Mb L: 23/25 MS: 1 InsertRepeatedBytes- 00:08:52.812 [2024-10-27 21:33:54.301981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.812 [2024-10-27 21:33:54.302016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.812 [2024-10-27 21:33:54.302097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.812 [2024-10-27 21:33:54.302120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.812 [2024-10-27 21:33:54.302243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.812 [2024-10-27 21:33:54.302265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.812 [2024-10-27 21:33:54.302385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.812 [2024-10-27 21:33:54.302406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.812 [2024-10-27 21:33:54.302530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:52.812 [2024-10-27 21:33:54.302553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.812 #25 NEW cov: 12452 ft: 15170 corp: 21/398b lim: 25 exec/s: 25 rss: 73Mb L: 25/25 MS: 1 CopyPart- 00:08:52.813 [2024-10-27 21:33:54.371280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.813 [2024-10-27 21:33:54.371307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.813 #26 NEW cov: 12452 ft: 15177 corp: 22/404b lim: 25 exec/s: 26 rss: 73Mb L: 6/25 MS: 1 ChangeByte- 00:08:52.813 [2024-10-27 21:33:54.442067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.813 [2024-10-27 21:33:54.442101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.813 [2024-10-27 21:33:54.442196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.813 [2024-10-27 21:33:54.442217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.813 [2024-10-27 21:33:54.442339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 
00:08:52.813 [2024-10-27 21:33:54.442363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.813 [2024-10-27 21:33:54.442475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.813 [2024-10-27 21:33:54.442497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.813 [2024-10-27 21:33:54.442621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:52.813 [2024-10-27 21:33:54.442642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.813 #27 NEW cov: 12452 ft: 15203 corp: 23/429b lim: 25 exec/s: 27 rss: 73Mb L: 25/25 MS: 1 CopyPart- 00:08:52.813 [2024-10-27 21:33:54.512139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.813 [2024-10-27 21:33:54.512171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.813 [2024-10-27 21:33:54.512260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.813 [2024-10-27 21:33:54.512281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.813 [2024-10-27 21:33:54.512399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.813 [2024-10-27 21:33:54.512422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.813 [2024-10-27 21:33:54.512540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.813 [2024-10-27 21:33:54.512562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.813 [2024-10-27 21:33:54.512687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:52.813 [2024-10-27 21:33:54.512711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.813 #28 NEW cov: 12452 ft: 15214 corp: 24/454b lim: 25 exec/s: 28 rss: 73Mb L: 25/25 MS: 1 CrossOver- 00:08:53.071 [2024-10-27 21:33:54.561982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.071 [2024-10-27 21:33:54.562016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.071 [2024-10-27 21:33:54.562086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.071 [2024-10-27 21:33:54.562126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.071 [2024-10-27 21:33:54.562250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.071 [2024-10-27 21:33:54.562274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.071 [2024-10-27 21:33:54.562391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:53.071 [2024-10-27 21:33:54.562414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.071 #29 NEW cov: 12452 ft: 15233 corp: 25/477b lim: 25 exec/s: 29 rss: 73Mb L: 23/25 MS: 1 EraseBytes- 00:08:53.071 [2024-10-27 21:33:54.612222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.071 [2024-10-27 21:33:54.612256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.071 [2024-10-27 21:33:54.612344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.071 [2024-10-27 21:33:54.612362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.071 [2024-10-27 21:33:54.612477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.071 [2024-10-27 21:33:54.612497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.071 [2024-10-27 21:33:54.612612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:53.071 [2024-10-27 21:33:54.612634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.072 [2024-10-27 21:33:54.612754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:53.072 [2024-10-27 21:33:54.612773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:53.072 #30 NEW cov: 12452 ft: 15255 corp: 26/502b lim: 25 exec/s: 30 rss: 73Mb L: 25/25 MS: 1 ChangeBit- 00:08:53.072 [2024-10-27 21:33:54.682135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.072 [2024-10-27 21:33:54.682166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.072 [2024-10-27 21:33:54.682252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.072 [2024-10-27 21:33:54.682275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.072 [2024-10-27 21:33:54.682391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.072 [2024-10-27 21:33:54.682411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.072 [2024-10-27 21:33:54.682539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:53.072 [2024-10-27 21:33:54.682560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.072 #31 NEW cov: 12452 ft: 15258 corp: 27/526b lim: 25 exec/s: 31 rss: 74Mb L: 
24/25 MS: 1 ChangeByte- 00:08:53.072 [2024-10-27 21:33:54.732114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.072 [2024-10-27 21:33:54.732146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.072 [2024-10-27 21:33:54.732243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.072 [2024-10-27 21:33:54.732270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.072 [2024-10-27 21:33:54.732394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.072 [2024-10-27 21:33:54.732413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.072 [2024-10-27 21:33:54.732539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:53.072 [2024-10-27 21:33:54.732563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.072 #32 NEW cov: 12452 ft: 15263 corp: 28/546b lim: 25 exec/s: 32 rss: 74Mb L: 20/25 MS: 1 EraseBytes- 00:08:53.331 [2024-10-27 21:33:54.801614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.331 [2024-10-27 21:33:54.801646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.331 #33 NEW cov: 12452 ft: 15302 corp: 29/555b lim: 25 exec/s: 33 rss: 74Mb L: 9/25 MS: 1 ShuffleBytes- 00:08:53.331 [2024-10-27 21:33:54.872209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:53.331 [2024-10-27 21:33:54.872240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.331 [2024-10-27 21:33:54.872323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:53.331 [2024-10-27 21:33:54.872346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.331 [2024-10-27 21:33:54.872464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:53.331 [2024-10-27 21:33:54.872489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.331 [2024-10-27 21:33:54.872613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:53.331 [2024-10-27 21:33:54.872634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.331 #34 NEW cov: 12452 ft: 15304 corp: 30/576b lim: 25 exec/s: 17 rss: 74Mb L: 21/25 MS: 1 ChangeBinInt- 00:08:53.331 #34 DONE cov: 12452 ft: 15304 corp: 30/576b lim: 25 exec/s: 17 rss: 74Mb 00:08:53.331 ###### Recommended dictionary. 
###### 00:08:53.331 "\377y\357)\224\206Z\014" # Uses: 0 00:08:53.331 "\262-\304I$\357z\000" # Uses: 0 00:08:53.331 "\357\207+q$\357z\000" # Uses: 0 00:08:53.331 ###### End of recommended dictionary. ###### 00:08:53.331 Done 34 runs in 2 second(s) 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:53.331 21:33:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:53.331 [2024-10-27 21:33:55.056396] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:08:53.590 [2024-10-27 21:33:55.056462] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3332570 ] 00:08:53.848 [2024-10-27 21:33:55.369312] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
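[Editorial sketch] The `+` trace above is nvmf/run.sh preparing fuzz run 24: it creates a fresh corpus directory, splices TCP port 4424 into a copy of fuzz_json.conf with sed, registers two known shutdown-path leaks for LeakSanitizer, and launches the llvm_nvme_fuzz harness against nqn.2016-06.io.spdk:cnode1. A condensed, hand-runnable sketch using only commands visible in the trace — the $SPDK shorthand, and the assumption that the echoed leak: lines populate the suppression file named in LSAN_OPTIONS, are ours rather than the script's:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
# Fresh per-run corpus directory (run index 24 maps to TCP port 4424).
mkdir -p $SPDK/../corpus/llvm_nvmf_24
# Point the JSON target config's TCP listener at this run's port.
sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' \
    $SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf > /tmp/fuzz_json_24.conf
# Known leaks in the qpair-disconnect and ctrlr-create paths, suppressed for LSan.
printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' \
    > /var/tmp/suppress_nvmf_fuzz
LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 \
  $SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
  -P $SPDK/../output/llvm/ \
  -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' \
  -c /tmp/fuzz_json_24.conf -t 1 -D $SPDK/../corpus/llvm_nvmf_24 -Z 24

Every flag here is copied from the trace; only the variable name and the ordering of the setup steps are editorial.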
00:08:53.848 [2024-10-27 21:33:55.415054] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.848 [2024-10-27 21:33:55.432734] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.848 [2024-10-27 21:33:55.485034] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:53.848 [2024-10-27 21:33:55.501343] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:53.848 INFO: Running with entropic power schedule (0xFF, 100). 00:08:53.848 INFO: Seed: 115309117 00:08:53.848 INFO: Loaded 1 modules (386371 inline 8-bit counters): 386371 [0x2aa2e8c, 0x2b013cf), 00:08:53.848 INFO: Loaded 1 PC tables (386371 PCs): 386371 [0x2b013d0,0x30e6800), 00:08:53.848 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:53.848 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.848 #2 INITED exec/s: 0 rss: 64Mb 00:08:53.848 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:53.848 This may also happen if the target rejected all inputs we tried so far 00:08:53.848 [2024-10-27 21:33:55.556404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1946157056 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.848 [2024-10-27 21:33:55.556434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.365 NEW_FUNC[1/717]: 0x48a8b8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:54.365 NEW_FUNC[2/717]: 0x49b538 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:54.365 #15 NEW cov: 12297 ft: 12279 corp: 2/30b lim: 100 exec/s: 0 rss: 72Mb L: 29/29 MS: 3 CopyPart-InsertByte-InsertRepeatedBytes- 00:08:54.365 [2024-10-27 21:33:55.866889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.365 [2024-10-27 21:33:55.866925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.365 [2024-10-27 21:33:55.866985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.365 [2024-10-27 21:33:55.867001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.365 [2024-10-27 21:33:55.867061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.365 [2024-10-27 21:33:55.867078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.365 #18 NEW cov: 12410 ft: 13556 corp: 3/98b lim: 100 exec/s: 0 rss: 72Mb L: 68/68 MS: 3 ChangeByte-CrossOver-InsertRepeatedBytes- 00:08:54.365 [2024-10-27 21:33:55.906637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.365 [2024-10-27 21:33:55.906667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.365 
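[Editorial sketch] The NEW_FUNC lines above are libFuzzer symbolizing program counters it reached for the first time: TestOneInput at llvm_nvme_fuzz.c:780 dispatches each mutated input to a per-command handler such as fuzz_nvm_compare_command at llvm_nvme_fuzz.c:685, whose submissions produce the COMPARE command/completion pairs that follow. Assuming the unstripped, non-PIE harness binary launched above is still on disk, the same hex addresses can be mapped back to source by hand with standard binutils:

BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz
# -f prints the function name alongside file:line for each address.
addr2line -f -e $BIN 0x48a8b8 0x49b538
# Expected to echo what libFuzzer already reported:
#   fuzz_nvm_compare_command  .../llvm_nvme_fuzz.c:685
#   TestOneInput              .../llvm_nvme_fuzz.c:780

This only works because the runtime addresses in the log match link-time addresses for a non-PIE build; for a PIE binary the load base would have to be subtracted first.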
[2024-10-27 21:33:55.906707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.365 [2024-10-27 21:33:55.906723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.365 #19 NEW cov: 12416 ft: 14017 corp: 4/149b lim: 100 exec/s: 0 rss: 72Mb L: 51/68 MS: 1 EraseBytes- 00:08:54.365 [2024-10-27 21:33:55.967050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.365 [2024-10-27 21:33:55.967079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.366 [2024-10-27 21:33:55.967129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.366 [2024-10-27 21:33:55.967144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.366 [2024-10-27 21:33:55.967198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.366 [2024-10-27 21:33:55.967214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.366 [2024-10-27 21:33:55.967271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.366 [2024-10-27 21:33:55.967285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.366 #21 NEW cov: 12501 ft: 14736 corp: 5/231b lim: 100 exec/s: 0 rss: 72Mb L: 82/82 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:54.366 [2024-10-27 21:33:56.006701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.366 [2024-10-27 21:33:56.006729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.366 [2024-10-27 21:33:56.006784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.366 [2024-10-27 21:33:56.006800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.366 #22 NEW cov: 12501 ft: 14878 corp: 6/280b lim: 100 exec/s: 0 rss: 72Mb L: 49/82 MS: 1 InsertRepeatedBytes- 00:08:54.366 [2024-10-27 21:33:56.046683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.366 [2024-10-27 21:33:56.046711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.366 [2024-10-27 21:33:56.046769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090715960753726 len:15933 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.366 [2024-10-27 
21:33:56.046784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.366 #28 NEW cov: 12501 ft: 14929 corp: 7/329b lim: 100 exec/s: 0 rss: 72Mb L: 49/82 MS: 1 ChangeBit- 00:08:54.644 [2024-10-27 21:33:56.106785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.106814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.644 [2024-10-27 21:33:56.106855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.106872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.644 #29 NEW cov: 12501 ft: 14992 corp: 8/378b lim: 100 exec/s: 0 rss: 72Mb L: 49/82 MS: 1 ShuffleBytes- 00:08:54.644 [2024-10-27 21:33:56.147083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.147111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.644 [2024-10-27 21:33:56.147164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.147179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.644 [2024-10-27 21:33:56.147234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.147266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.644 [2024-10-27 21:33:56.147323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.147338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.644 #30 NEW cov: 12501 ft: 15131 corp: 9/461b lim: 100 exec/s: 0 rss: 72Mb L: 83/83 MS: 1 InsertByte- 00:08:54.644 [2024-10-27 21:33:56.206656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073125573119 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.206684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.644 #35 NEW cov: 12501 ft: 15254 corp: 10/494b lim: 100 exec/s: 0 rss: 72Mb L: 33/83 MS: 5 ChangeByte-ShuffleBytes-CopyPart-InsertByte-InsertRepeatedBytes- 00:08:54.644 [2024-10-27 21:33:56.247130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.247157] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.644 [2024-10-27 21:33:56.247209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.247226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.644 [2024-10-27 21:33:56.247279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.247295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.644 [2024-10-27 21:33:56.247359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551614 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.247374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.644 #36 NEW cov: 12501 ft: 15350 corp: 11/577b lim: 100 exec/s: 0 rss: 72Mb L: 83/83 MS: 1 ChangeBit- 00:08:54.644 [2024-10-27 21:33:56.306882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.306911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.644 [2024-10-27 21:33:56.306967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.306983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.644 #37 NEW cov: 12501 ft: 15365 corp: 12/626b lim: 100 exec/s: 0 rss: 72Mb L: 49/83 MS: 1 ChangeBinInt- 00:08:54.644 [2024-10-27 21:33:56.346914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.346948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.644 [2024-10-27 21:33:56.347002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.644 [2024-10-27 21:33:56.347018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.948 #38 NEW cov: 12501 ft: 15378 corp: 13/676b lim: 100 exec/s: 0 rss: 72Mb L: 50/83 MS: 1 EraseBytes- 00:08:54.948 [2024-10-27 21:33:56.406922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.406956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.948 [2024-10-27 21:33:56.407007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.407023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.948 NEW_FUNC[1/1]: 0x1c3fc18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:54.948 #39 NEW cov: 12524 ft: 15424 corp: 14/727b lim: 100 exec/s: 0 rss: 73Mb L: 51/83 MS: 1 CMP- DE: "\000\021"- 00:08:54.948 [2024-10-27 21:33:56.466957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.466985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.948 [2024-10-27 21:33:56.467022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.467037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.948 #40 NEW cov: 12524 ft: 15469 corp: 15/776b lim: 100 exec/s: 0 rss: 73Mb L: 49/83 MS: 1 ShuffleBytes- 00:08:54.948 [2024-10-27 21:33:56.506935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.506967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.948 [2024-10-27 21:33:56.507028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.507047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.948 #41 NEW cov: 12524 ft: 15491 corp: 16/820b lim: 100 exec/s: 0 rss: 73Mb L: 44/83 MS: 1 EraseBytes- 00:08:54.948 [2024-10-27 21:33:56.547322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.547349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.948 [2024-10-27 21:33:56.547405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.547421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.948 [2024-10-27 21:33:56.547476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.547491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.948 [2024-10-27 21:33:56.547546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551614 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:54.948 [2024-10-27 21:33:56.547561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.948 #42 NEW cov: 12524 ft: 15530 corp: 17/903b lim: 100 exec/s: 42 rss: 73Mb L: 83/83 MS: 1 CopyPart- 00:08:54.948 [2024-10-27 21:33:56.607047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.607075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.948 [2024-10-27 21:33:56.607115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.948 [2024-10-27 21:33:56.607131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.948 #43 NEW cov: 12524 ft: 15543 corp: 18/953b lim: 100 exec/s: 43 rss: 73Mb L: 50/83 MS: 1 ShuffleBytes- 00:08:55.235 [2024-10-27 21:33:56.667079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.667109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.235 [2024-10-27 21:33:56.667163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.667179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.235 #44 NEW cov: 12524 ft: 15568 corp: 19/1002b lim: 100 exec/s: 44 rss: 73Mb L: 49/83 MS: 1 ShuffleBytes- 00:08:55.235 [2024-10-27 21:33:56.727122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.727149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.235 [2024-10-27 21:33:56.727188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.727203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.235 #45 NEW cov: 12524 ft: 15597 corp: 20/1051b lim: 100 exec/s: 45 rss: 73Mb L: 49/83 MS: 1 ChangeByte- 00:08:55.235 [2024-10-27 21:33:56.767099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.767127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.235 [2024-10-27 21:33:56.767180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090714920566334 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.767196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.235 #46 NEW cov: 12524 ft: 15623 corp: 21/1103b lim: 100 exec/s: 46 rss: 73Mb L: 52/83 MS: 1 InsertByte- 00:08:55.235 [2024-10-27 21:33:56.827474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.827502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.235 [2024-10-27 21:33:56.827568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090714916487230 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.827584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.235 [2024-10-27 21:33:56.827640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4485090715960753726 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.827656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.235 [2024-10-27 21:33:56.827713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.827728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.235 #47 NEW cov: 12524 ft: 15693 corp: 22/1196b lim: 100 exec/s: 47 rss: 73Mb L: 93/93 MS: 1 CrossOver- 00:08:55.235 [2024-10-27 21:33:56.867132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.867161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.235 [2024-10-27 21:33:56.867201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.867216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.235 #48 NEW cov: 12524 ft: 15739 corp: 23/1244b lim: 100 exec/s: 48 rss: 73Mb L: 48/93 MS: 1 EraseBytes- 00:08:55.235 [2024-10-27 21:33:56.907158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.907186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.235 [2024-10-27 21:33:56.907225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090715960753726 len:49471 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.235 [2024-10-27 21:33:56.907241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.500 #49 NEW cov: 12524 ft: 15743 corp: 24/1293b lim: 100 exec/s: 49 rss: 73Mb L: 49/93 MS: 1 ChangeBinInt- 00:08:55.500 [2024-10-27 21:33:56.967363] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:55.500 [2024-10-27 21:33:56.967394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.500 [2024-10-27 21:33:56.967433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.500 [2024-10-27 21:33:56.967448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.500 [2024-10-27 21:33:56.967501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.500 [2024-10-27 21:33:56.967517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.500 #50 NEW cov: 12524 ft: 15753 corp: 25/1362b lim: 100 exec/s: 50 rss: 73Mb L: 69/93 MS: 1 InsertByte- 00:08:55.500 [2024-10-27 21:33:57.007033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.500 [2024-10-27 21:33:57.007062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.500 #51 NEW cov: 12524 ft: 15758 corp: 26/1386b lim: 100 exec/s: 51 rss: 73Mb L: 24/93 MS: 1 CrossOver- 00:08:55.500 [2024-10-27 21:33:57.047074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.500 [2024-10-27 21:33:57.047103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.500 #52 NEW cov: 12524 ft: 15766 corp: 27/1411b lim: 100 exec/s: 52 rss: 73Mb L: 25/93 MS: 1 InsertByte- 00:08:55.501 [2024-10-27 21:33:57.107406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.501 [2024-10-27 21:33:57.107434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.501 [2024-10-27 21:33:57.107482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4294967078 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.501 [2024-10-27 21:33:57.107497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.501 [2024-10-27 21:33:57.107554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.501 [2024-10-27 21:33:57.107569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.501 #53 NEW cov: 12524 ft: 15841 corp: 28/1482b lim: 100 exec/s: 53 rss: 73Mb L: 71/93 MS: 1 CrossOver- 00:08:55.501 [2024-10-27 21:33:57.167314] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.501 [2024-10-27 21:33:57.167343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:08:55.501 [2024-10-27 21:33:57.167393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.501 [2024-10-27 21:33:57.167409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.501 #54 NEW cov: 12524 ft: 15877 corp: 29/1532b lim: 100 exec/s: 54 rss: 73Mb L: 50/93 MS: 1 ShuffleBytes- 00:08:55.501 [2024-10-27 21:33:57.207592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.501 [2024-10-27 21:33:57.207620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.501 [2024-10-27 21:33:57.207667] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.501 [2024-10-27 21:33:57.207682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.501 [2024-10-27 21:33:57.207737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4294967040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.501 [2024-10-27 21:33:57.207753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.501 [2024-10-27 21:33:57.207811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.501 [2024-10-27 21:33:57.207827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.760 #55 NEW cov: 12524 ft: 15895 corp: 30/1627b lim: 100 exec/s: 55 rss: 73Mb L: 95/95 MS: 1 InsertRepeatedBytes- 00:08:55.760 [2024-10-27 21:33:57.247189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073125573119 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.760 [2024-10-27 21:33:57.247217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.760 #56 NEW cov: 12524 ft: 15901 corp: 31/1660b lim: 100 exec/s: 56 rss: 74Mb L: 33/95 MS: 1 ChangeByte- 00:08:55.760 [2024-10-27 21:33:57.307221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446743240670445567 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.760 [2024-10-27 21:33:57.307248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.760 #57 NEW cov: 12524 ft: 15914 corp: 32/1697b lim: 100 exec/s: 57 rss: 74Mb L: 37/95 MS: 1 CrossOver- 00:08:55.760 [2024-10-27 21:33:57.347216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1946157056 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.760 [2024-10-27 21:33:57.347242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.760 #58 NEW cov: 12524 ft: 15953 corp: 33/1726b lim: 100 exec/s: 58 rss: 74Mb L: 29/95 MS: 1 ChangeByte- 00:08:55.760 [2024-10-27 21:33:57.407406] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4485090660126178878 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.760 [2024-10-27 21:33:57.407435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.760 [2024-10-27 21:33:57.407478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4485090715960753726 len:15935 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.760 [2024-10-27 21:33:57.407494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.760 #59 NEW cov: 12524 ft: 15960 corp: 34/1775b lim: 100 exec/s: 59 rss: 74Mb L: 49/95 MS: 1 ChangeBinInt- 00:08:55.760 [2024-10-27 21:33:57.447738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.760 [2024-10-27 21:33:57.447768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.760 [2024-10-27 21:33:57.447817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.760 [2024-10-27 21:33:57.447834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.760 [2024-10-27 21:33:57.447891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18445618173802708991 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.760 [2024-10-27 21:33:57.447910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.760 [2024-10-27 21:33:57.447971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.760 [2024-10-27 21:33:57.447987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.760 #60 NEW cov: 12524 ft: 15994 corp: 35/1858b lim: 100 exec/s: 60 rss: 74Mb L: 83/95 MS: 1 ChangeBit- 00:08:56.019 [2024-10-27 21:33:57.487462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:57166 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.019 [2024-10-27 21:33:57.487491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.019 [2024-10-27 21:33:57.487546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.019 [2024-10-27 21:33:57.487562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.019 #61 NEW cov: 12524 ft: 16006 corp: 36/1908b lim: 100 exec/s: 61 rss: 74Mb L: 50/95 MS: 1 CMP- DE: "\000\000\000\000\002<\337M"- 00:08:56.019 [2024-10-27 21:33:57.547452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446743952867340593 len:58340 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.019 [2024-10-27 21:33:57.547480] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.019 [2024-10-27 21:33:57.547518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16421219234243404771 len:58340 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:56.019 [2024-10-27 21:33:57.547534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.019 #63 NEW cov: 12524 ft: 16017 corp: 37/1960b lim: 100 exec/s: 31 rss: 74Mb L: 52/95 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:56.019 #63 DONE cov: 12524 ft: 16017 corp: 37/1960b lim: 100 exec/s: 31 rss: 74Mb 00:08:56.019 ###### Recommended dictionary. ###### 00:08:56.019 "\000\021" # Uses: 0 00:08:56.019 "\000\000\000\000\002<\337M" # Uses: 0 00:08:56.019 ###### End of recommended dictionary. ###### 00:08:56.019 Done 63 runs in 2 second(s) 00:08:56.019 21:33:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:56.019 21:33:57 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:56.019 21:33:57 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:56.019 21:33:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:56.019 00:08:56.019 real 1m6.950s 00:08:56.019 user 1m39.248s 00:08:56.019 sys 0m9.040s 00:08:56.019 21:33:57 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:56.019 21:33:57 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:56.019 ************************************ 00:08:56.019 END TEST nvmf_llvm_fuzz 00:08:56.019 ************************************ 00:08:56.019 21:33:57 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:56.019 21:33:57 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:56.019 21:33:57 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:56.019 21:33:57 llvm_fuzz -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:56.019 21:33:57 llvm_fuzz -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:56.019 21:33:57 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:56.278 ************************************ 00:08:56.278 START TEST vfio_llvm_fuzz 00:08:56.279 ************************************ 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:56.279 * Looking for test storage... 
00:08:56.279 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1688 -- # [[ y == y ]] 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1689 -- # lcov --version 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1689 -- # awk '{print $NF}' 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1689 -- # lt 1.15 2 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS= 00:08:56.279 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:56.279 --rc genhtml_branch_coverage=1 00:08:56.279 --rc genhtml_function_coverage=1 00:08:56.279 --rc genhtml_legend=1 00:08:56.279 --rc geninfo_all_blocks=1 00:08:56.279 --rc geninfo_unexecuted_blocks=1 00:08:56.279 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:56.279 ' 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1702 -- # LCOV_OPTS=' 00:08:56.279 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:56.279 --rc genhtml_branch_coverage=1 00:08:56.279 --rc genhtml_function_coverage=1 00:08:56.279 --rc genhtml_legend=1 00:08:56.279 --rc geninfo_all_blocks=1 00:08:56.279 --rc geninfo_unexecuted_blocks=1 00:08:56.279 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:56.279 ' 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov 00:08:56.279 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:56.279 --rc genhtml_branch_coverage=1 00:08:56.279 --rc genhtml_function_coverage=1 00:08:56.279 --rc genhtml_legend=1 00:08:56.279 --rc geninfo_all_blocks=1 00:08:56.279 --rc geninfo_unexecuted_blocks=1 00:08:56.279 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:56.279 ' 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1703 -- # LCOV='lcov 00:08:56.279 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:56.279 --rc genhtml_branch_coverage=1 00:08:56.279 --rc genhtml_function_coverage=1 00:08:56.279 --rc genhtml_legend=1 00:08:56.279 --rc geninfo_all_blocks=1 00:08:56.279 --rc geninfo_unexecuted_blocks=1 00:08:56.279 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:56.279 ' 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:56.279 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:08:56.280 21:33:57 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:56.280 21:33:57 
llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:56.280 21:33:57 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:56.280 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:56.280 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:56.541 #define SPDK_CONFIG_H 00:08:56.541 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:56.541 #define SPDK_CONFIG_APPS 1 00:08:56.541 #define SPDK_CONFIG_ARCH native 00:08:56.541 #undef SPDK_CONFIG_ASAN 00:08:56.541 #undef SPDK_CONFIG_AVAHI 00:08:56.541 #undef SPDK_CONFIG_CET 00:08:56.541 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:56.541 #define SPDK_CONFIG_COVERAGE 1 00:08:56.541 #define SPDK_CONFIG_CROSS_PREFIX 00:08:56.541 #undef SPDK_CONFIG_CRYPTO 00:08:56.541 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:56.541 #undef SPDK_CONFIG_CUSTOMOCF 00:08:56.541 #undef SPDK_CONFIG_DAOS 00:08:56.541 #define SPDK_CONFIG_DAOS_DIR 00:08:56.541 #define SPDK_CONFIG_DEBUG 1 00:08:56.541 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:56.541 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:56.541 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:56.541 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:56.541 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:56.541 #undef SPDK_CONFIG_DPDK_UADK 00:08:56.541 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:56.541 #define SPDK_CONFIG_EXAMPLES 1 00:08:56.541 #undef SPDK_CONFIG_FC 00:08:56.541 #define SPDK_CONFIG_FC_PATH 00:08:56.541 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:56.541 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:56.541 #define SPDK_CONFIG_FSDEV 1 00:08:56.541 #undef 
SPDK_CONFIG_FUSE 00:08:56.541 #define SPDK_CONFIG_FUZZER 1 00:08:56.541 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:56.541 #undef SPDK_CONFIG_GOLANG 00:08:56.541 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:56.541 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:56.541 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:56.541 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:56.541 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:56.541 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:56.541 #undef SPDK_CONFIG_HAVE_LZ4 00:08:56.541 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:56.541 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:56.541 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:56.541 #define SPDK_CONFIG_IDXD 1 00:08:56.541 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:56.541 #undef SPDK_CONFIG_IPSEC_MB 00:08:56.541 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:56.541 #define SPDK_CONFIG_ISAL 1 00:08:56.541 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:56.541 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:56.541 #define SPDK_CONFIG_LIBDIR 00:08:56.541 #undef SPDK_CONFIG_LTO 00:08:56.541 #define SPDK_CONFIG_MAX_LCORES 128 00:08:56.541 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:08:56.541 #define SPDK_CONFIG_NVME_CUSE 1 00:08:56.541 #undef SPDK_CONFIG_OCF 00:08:56.541 #define SPDK_CONFIG_OCF_PATH 00:08:56.541 #define SPDK_CONFIG_OPENSSL_PATH 00:08:56.541 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:56.541 #define SPDK_CONFIG_PGO_DIR 00:08:56.541 #undef SPDK_CONFIG_PGO_USE 00:08:56.541 #define SPDK_CONFIG_PREFIX /usr/local 00:08:56.541 #undef SPDK_CONFIG_RAID5F 00:08:56.541 #undef SPDK_CONFIG_RBD 00:08:56.541 #define SPDK_CONFIG_RDMA 1 00:08:56.541 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:56.541 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:56.541 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:56.541 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:56.541 #undef SPDK_CONFIG_SHARED 00:08:56.541 #undef SPDK_CONFIG_SMA 00:08:56.541 #define SPDK_CONFIG_TESTS 1 00:08:56.541 #undef SPDK_CONFIG_TSAN 00:08:56.541 #define SPDK_CONFIG_UBLK 1 00:08:56.541 #define SPDK_CONFIG_UBSAN 1 00:08:56.541 #undef SPDK_CONFIG_UNIT_TESTS 00:08:56.541 #undef SPDK_CONFIG_URING 00:08:56.541 #define SPDK_CONFIG_URING_PATH 00:08:56.541 #undef SPDK_CONFIG_URING_ZNS 00:08:56.541 #undef SPDK_CONFIG_USDT 00:08:56.541 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:56.541 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:56.541 #define SPDK_CONFIG_VFIO_USER 1 00:08:56.541 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:56.541 #define SPDK_CONFIG_VHOST 1 00:08:56.541 #define SPDK_CONFIG_VIRTIO 1 00:08:56.541 #undef SPDK_CONFIG_VTUNE 00:08:56.541 #define SPDK_CONFIG_VTUNE_DIR 00:08:56.541 #define SPDK_CONFIG_WERROR 1 00:08:56.541 #define SPDK_CONFIG_WPDK_DIR 00:08:56.541 #undef SPDK_CONFIG_XNVME 00:08:56.541 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:56.541 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:56.542 21:33:58 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:56.542 
21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : main 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:56.542 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # :
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@189 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@198 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # cat
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']'
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@257 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@263 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV=
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]]
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ 1 -eq 1 ]]
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV=1
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh'
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]=
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh'
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']'
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@277 -- # export valgrind=
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@277 -- # valgrind=
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@283 -- # uname -s
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']'
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@284 -- # HUGEMEM=4096
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # MAKE=make
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j112
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@304 -- # export HUGEMEM=4096
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@304 -- # HUGEMEM=4096
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # NO_HUGE=()
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@307 -- # TEST_MODE=
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@329 -- # [[ -z 3333124 ]]
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@329 -- # kill -0 3333124
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1674 -- # set_test_storage 2147483648
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@339 -- # [[ -v testdir ]]
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # local requested_size=2147483648
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@342 -- # local mount target_dir
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@345 -- # local source fs size avail mount use
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX
00:08:56.543 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.bdzxHM
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # [[ -n '' ]]
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@361 -- # [[ -n '' ]]
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@366 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.bdzxHM/tests/vfio /tmp/spdk.bdzxHM
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@369 -- # requested_size=2214592512
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@338 -- # df -T
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@338 -- # grep -v Filesystem
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=0
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=607576064
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=4676853760
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=51412680704
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=61730611200
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=10317930496
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30860541952
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865305600
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=4763648
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=12340129792
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346122240
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5992448
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30864478208
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865305600
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=827392
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=6173048832
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173061120
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=12288
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n'
00:08:56.544 * Looking for test storage...
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # local target_space new_size
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}"
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}'
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # mount=/
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # target_space=51412680704
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size ))
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@389 -- # (( target_space >= requested_size ))
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]]
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]]
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ / == / ]]
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@392 -- # new_size=12532523008
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 ))
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:08:56.544 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # return 0
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1676 -- # set -o errtrace
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1677 -- # shopt -s extdebug
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1678 -- # trap 'trap - ERR; print_backtrace >&2' ERR
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ '
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # true
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1683 -- # xtrace_fd
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]]
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]]
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]'
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 ))
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1688 -- # [[ y == y ]]
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1689 -- # lcov --version
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1689 -- # awk '{print $NF}'
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1689 -- # lt 1.15 2
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-:
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-:
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<'
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 ))
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]]
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]]
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] ))
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] ))
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1690 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:08:56.544 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1702 -- # export 'LCOV_OPTS=
00:08:56.544 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:08:56.544 --rc genhtml_branch_coverage=1
00:08:56.544 --rc genhtml_function_coverage=1
00:08:56.544 --rc genhtml_legend=1
00:08:56.544 --rc geninfo_all_blocks=1
00:08:56.544 --rc geninfo_unexecuted_blocks=1
00:08:56.545 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:08:56.545 '
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1702 -- # LCOV_OPTS='
00:08:56.545 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:08:56.545 --rc genhtml_branch_coverage=1
00:08:56.545 --rc genhtml_function_coverage=1
00:08:56.545 --rc genhtml_legend=1
00:08:56.545 --rc geninfo_all_blocks=1
00:08:56.545 --rc geninfo_unexecuted_blocks=1
00:08:56.545 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:08:56.545 '
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1703 -- # export 'LCOV=lcov
00:08:56.545 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:08:56.545 --rc genhtml_branch_coverage=1
00:08:56.545 --rc genhtml_function_coverage=1
00:08:56.545 --rc genhtml_legend=1
00:08:56.545 --rc geninfo_all_blocks=1
00:08:56.545 --rc geninfo_unexecuted_blocks=1
00:08:56.545 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:08:56.545 '
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1703 -- # LCOV='lcov
00:08:56.545 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:08:56.545 --rc genhtml_branch_coverage=1
00:08:56.545 --rc genhtml_function_coverage=1
00:08:56.545 --rc genhtml_legend=1
00:08:56.545 --rc geninfo_all_blocks=1
00:08:56.545 --rc geninfo_unexecuted_blocks=1
00:08:56.545 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:08:56.545 '
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=()
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 ))
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]]
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 ))
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%;
00:08:56.545 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
00:08:56.545 21:33:58 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0
00:08:56.545 [2024-10-27 21:33:58.240922] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization...
00:08:56.545 [2024-10-27 21:33:58.240995] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3333211 ]
00:08:56.804 [2024-10-27 21:33:58.375819] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation.
00:08:56.804 [2024-10-27 21:33:58.419546] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:56.804 [2024-10-27 21:33:58.441983] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:57.063 INFO: Running with entropic power schedule (0xFF, 100).
00:08:57.063 INFO: Seed: 3216314785
00:08:57.063 INFO: Loaded 1 modules (383607 inline 8-bit counters): 383607 [0x2a646cc, 0x2ac2143),
00:08:57.063 INFO: Loaded 1 PC tables (383607 PCs): 383607 [0x2ac2148,0x309c8b8),
00:08:57.063 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0
00:08:57.063 INFO: A corpus is not provided, starting from an empty corpus
00:08:57.063 #2 INITED exec/s: 0 rss: 67Mb
00:08:57.063 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:57.063 This may also happen if the target rejected all inputs we tried so far
00:08:57.063 [2024-10-27 21:33:58.670162] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller
00:08:57.580 NEW_FUNC[1/672]: 0x45e778 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84
00:08:57.580 NEW_FUNC[2/672]: 0x464288 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:57.580 #44 NEW cov: 11145 ft: 10782 corp: 2/7b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 2 CopyPart-InsertRepeatedBytes-
00:08:57.580 #46 NEW cov: 11172 ft: 13950 corp: 3/13b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 2 EraseBytes-CopyPart-
00:08:57.838 #47 NEW cov: 11172 ft: 16049 corp: 4/19b lim: 6 exec/s: 0 rss: 75Mb L: 6/6 MS: 1 CrossOver-
00:08:57.838 #48 NEW cov: 11172 ft: 16164 corp: 5/25b lim: 6 exec/s: 0 rss: 75Mb L: 6/6 MS: 1 ShuffleBytes-
00:08:57.838 NEW_FUNC[1/1]: 0x1c0c068 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662
00:08:57.838 #59 NEW cov: 11189 ft: 16565 corp: 6/31b lim: 6 exec/s: 0 rss: 75Mb L: 6/6 MS: 1 ChangeBinInt-
00:08:58.096 #60 NEW cov: 11189 ft: 16879 corp: 7/37b lim: 6 exec/s: 60 rss: 75Mb L: 6/6 MS: 1 CopyPart-
00:08:58.096 #61 NEW cov: 11189 ft: 17244 corp: 8/43b lim: 6 exec/s: 61 rss: 76Mb L: 6/6 MS: 1 ChangeByte-
00:08:58.353 #63 NEW cov: 11189 ft: 17481 corp: 9/49b lim: 6 exec/s: 63 rss: 76Mb L: 6/6 MS: 2 EraseBytes-CopyPart-
00:08:58.353 #64 NEW cov: 11189 ft: 17819 corp: 10/55b lim: 6 exec/s: 64 rss: 76Mb L: 6/6 MS: 1 ChangeBinInt-
00:08:58.612 #69 NEW cov: 11189 ft: 17862 corp: 11/61b lim: 6 exec/s: 69 rss: 76Mb L: 6/6 MS: 5 CrossOver-CrossOver-ChangeByte-ChangeBinInt-InsertRepeatedBytes-
00:08:58.612 #70 NEW cov: 11189 ft: 18039 corp: 12/67b lim: 6 exec/s: 70 rss: 76Mb L: 6/6 MS: 1 ChangeByte-
00:08:58.870 #71 NEW cov: 11189 ft: 18402 corp: 13/73b lim: 6 exec/s: 71 rss: 76Mb L: 6/6 MS: 1 ChangeByte-
00:08:58.870 #72 NEW cov: 11196 ft: 18477 corp: 14/79b lim: 6 exec/s: 72 rss: 76Mb L: 6/6 MS: 1 ShuffleBytes-
00:08:59.129 #73 NEW cov: 11196 ft: 18516 corp: 15/85b lim: 6 exec/s: 73 rss: 76Mb L: 6/6 MS: 1 ChangeBit-
00:08:59.129 #74 NEW cov: 11196 ft: 18584 corp: 16/91b lim: 6 exec/s: 37 rss: 76Mb L: 6/6 MS: 1 ChangeBinInt-
00:08:59.129 #74 DONE cov: 11196 ft: 18584 corp: 16/91b lim: 6 exec/s: 37 rss: 76Mb
00:08:59.129 Done 74 runs in 2 second(s)
00:08:59.129 [2024-10-27 21:34:00.745129] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1
00:08:59.388 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2
00:08:59.389 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf
00:08:59.389 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
00:08:59.389 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
00:08:59.389 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1
00:08:59.389 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%;
00:08:59.389 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:59.389 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:59.389 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
00:08:59.389 21:34:00 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1
00:08:59.389 [2024-10-27 21:34:00.998632] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization...
00:08:59.389 [2024-10-27 21:34:00.998701] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3333736 ]
00:08:59.647 [2024-10-27 21:34:01.132759] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation.
00:08:59.647 [2024-10-27 21:34:01.176483] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:59.647 [2024-10-27 21:34:01.198678] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:59.905 INFO: Running with entropic power schedule (0xFF, 100).
00:08:59.905 INFO: Seed: 1683357405
00:08:59.905 INFO: Loaded 1 modules (383607 inline 8-bit counters): 383607 [0x2a646cc, 0x2ac2143),
00:08:59.905 INFO: Loaded 1 PC tables (383607 PCs): 383607 [0x2ac2148,0x309c8b8),
00:08:59.905 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1
00:08:59.905 INFO: A corpus is not provided, starting from an empty corpus
00:08:59.905 #2 INITED exec/s: 0 rss: 66Mb
00:08:59.905 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:59.905 This may also happen if the target rejected all inputs we tried so far
00:08:59.905 [2024-10-27 21:34:01.434644] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller
00:08:59.905 [2024-10-27 21:34:01.502165] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:08:59.905 [2024-10-27 21:34:01.502188] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:08:59.905 [2024-10-27 21:34:01.502207] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:09:00.164 NEW_FUNC[1/674]: 0x45ed18 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71
00:09:00.164 NEW_FUNC[2/674]: 0x464288 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:09:00.164 #29 NEW cov: 11141 ft: 10781 corp: 2/5b lim: 4 exec/s: 0 rss: 73Mb L: 4/4 MS: 2 InsertByte-CMP- DE: "\377\011"-
00:09:00.421 [2024-10-27 21:34:01.982072] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:09:00.421 [2024-10-27 21:34:01.982105] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:09:00.421 [2024-10-27 21:34:01.982123] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:09:00.421 #33 NEW cov: 11155 ft: 14093 corp: 3/9b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 4 CopyPart-ShuffleBytes-CrossOver-PersAutoDict- DE: "\377\011"-
00:09:00.679 [2024-10-27 21:34:02.166167] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:09:00.679 [2024-10-27 21:34:02.166190] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:09:00.679 [2024-10-27 21:34:02.166209] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:09:00.679 NEW_FUNC[1/1]: 0x1c0c068 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662
00:09:00.679 #34 NEW cov: 11175 ft: 15306 corp: 4/13b lim: 4 exec/s: 0 rss: 75Mb L: 4/4 MS: 1 PersAutoDict- DE: "\377\011"-
00:09:00.680 [2024-10-27 21:34:02.350352] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:09:00.680 [2024-10-27 21:34:02.350376] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:09:00.680 [2024-10-27 21:34:02.350398] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:09:00.937 #38 NEW cov: 11175 ft: 16436 corp: 5/17b lim: 4 exec/s: 38 rss: 75Mb L: 4/4 MS: 4 ChangeByte-ChangeByte-CopyPart-CrossOver-
00:09:00.937 [2024-10-27 21:34:02.541360] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:09:00.937 [2024-10-27 21:34:02.541383] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:09:00.937 [2024-10-27 21:34:02.541401] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:09:00.937 #39 NEW cov: 11175 ft: 17192 corp: 6/21b lim: 4 exec/s: 39 rss: 77Mb L: 4/4 MS: 1 CrossOver-
00:09:01.195 [2024-10-27 21:34:02.721366] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:09:01.195 [2024-10-27 21:34:02.721387] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:09:01.195 [2024-10-27 21:34:02.721405] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:09:01.195 #49 NEW cov: 11175 ft: 17682 corp: 7/25b lim: 4 exec/s: 49 rss: 77Mb L: 4/4 MS: 5 InsertByte-ChangeByte-CrossOver-CrossOver-CopyPart-
00:09:01.195 [2024-10-27 21:34:02.896577] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:09:01.195 [2024-10-27 21:34:02.896598] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:09:01.195 [2024-10-27 21:34:02.896615] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:09:01.453 #65 NEW cov: 11175 ft: 17833 corp: 8/29b lim: 4 exec/s: 65 rss: 77Mb L: 4/4 MS: 1 PersAutoDict- DE: "\377\011"-
00:09:01.453 [2024-10-27 21:34:03.077527] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:09:01.453 [2024-10-27 21:34:03.077548] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:09:01.453 [2024-10-27 21:34:03.077566] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:09:01.710 #66 NEW cov: 11175 ft: 18037 corp: 9/33b lim: 4 exec/s: 66 rss: 77Mb L: 4/4 MS: 1 ShuffleBytes-
00:09:01.710 [2024-10-27 21:34:03.271070] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:09:01.710 [2024-10-27 21:34:03.271092] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:09:01.711 [2024-10-27 21:34:03.271110] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:09:01.711 #72 NEW cov: 11182 ft: 18242 corp: 10/37b lim: 4 exec/s: 72 rss: 77Mb L: 4/4 MS: 1 CopyPart-
00:09:01.970 [2024-10-27 21:34:03.447404] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:09:01.970 [2024-10-27 21:34:03.447425] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:09:01.970 [2024-10-27 21:34:03.447443] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:09:01.970 #73 NEW cov: 11182 ft: 18358 corp: 11/41b lim: 4 exec/s: 36 rss: 77Mb L: 4/4 MS: 1 CopyPart-
00:09:01.970 #73 DONE cov: 11182 ft: 18358 corp: 11/41b lim: 4 exec/s: 36 rss: 77Mb
00:09:01.970 ###### Recommended dictionary. ######
00:09:01.970 "\377\011" # Uses: 4
00:09:01.970 ###### End of recommended dictionary. ######
00:09:01.970 Done 73 runs in 2 second(s)
00:09:01.970 [2024-10-27 21:34:03.569143] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%;
00:09:02.230 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
00:09:02.230 21:34:03 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2
00:09:02.230 [2024-10-27 21:34:03.821028] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization...
00:09:02.230 [2024-10-27 21:34:03.821098] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3334144 ]
00:09:02.489 [2024-10-27 21:34:03.955806] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation.
00:09:02.489 [2024-10-27 21:34:03.999635] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:02.489 [2024-10-27 21:34:04.024124] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:09:02.489 INFO: Running with entropic power schedule (0xFF, 100).
00:09:02.489 INFO: Seed: 211379663
00:09:02.748 INFO: Loaded 1 modules (383607 inline 8-bit counters): 383607 [0x2a646cc, 0x2ac2143),
00:09:02.748 INFO: Loaded 1 PC tables (383607 PCs): 383607 [0x2ac2148,0x309c8b8),
00:09:02.748 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2
00:09:02.748 INFO: A corpus is not provided, starting from an empty corpus
00:09:02.748 #2 INITED exec/s: 0 rss: 67Mb
00:09:02.748 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:02.748 This may also happen if the target rejected all inputs we tried so far
00:09:02.748 [2024-10-27 21:34:04.254370] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller
00:09:02.748 [2024-10-27 21:34:04.305566] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:09:03.007 NEW_FUNC[1/673]: 0x45f708 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103
00:09:03.007 NEW_FUNC[2/673]: 0x464288 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:09:03.007 #7 NEW cov: 11125 ft: 11050 corp: 2/9b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 5 CopyPart-CopyPart-CopyPart-ChangeBit-InsertRepeatedBytes-
00:09:03.266 [2024-10-27 21:34:04.756954] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:09:03.266 #8 NEW cov: 11141 ft: 14561 corp: 3/17b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 ChangeByte-
00:09:03.266 [2024-10-27 21:34:04.931633] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:09:03.525 NEW_FUNC[1/1]: 0x1c0c068 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662
00:09:03.525 #14 NEW cov: 11158 ft: 15069 corp: 4/25b lim: 8 exec/s: 0 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes-
00:09:03.525 [2024-10-27 21:34:05.105672] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:09:03.525 #20 NEW cov: 11158 ft: 15135 corp: 5/33b lim: 8 exec/s: 0 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes-
00:09:03.784 [2024-10-27 21:34:05.278522] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:09:03.784 #21 NEW cov: 11158 ft: 15637 corp: 6/41b lim: 8 exec/s: 21 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes-
00:09:03.784 [2024-10-27 21:34:05.454756] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: cmd 5 failed: Invalid argument
00:09:03.784 [2024-10-27 21:34:05.454791] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure
00:09:04.043 NEW_FUNC[1/1]: 0x15997f8 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3094
00:09:04.043 #22 NEW cov: 11168 ft: 15774 corp: 7/49b lim: 8 exec/s: 22 rss: 75Mb L: 8/8 MS: 1 ChangeBinInt-
00:09:04.043 [2024-10-27 21:34:05.640243] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:09:04.043 #23 NEW cov: 11168 ft: 16751 corp: 8/57b lim: 8 exec/s: 23 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes-
00:09:04.302 [2024-10-27 21:34:05.823209] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5
00:09:04.302 [2024-10-27 21:34:05.823240] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure
00:09:04.302 #27 NEW cov: 11168 ft: 17202 corp: 9/65b lim: 8 exec/s: 27 rss: 75Mb L: 8/8 MS: 4 InsertRepeatedBytes-ChangeBinInt-EraseBytes-InsertRepeatedBytes-
00:09:04.302 [2024-10-27 21:34:06.006808] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:09:04.562 #28 NEW cov: 11175 ft: 17220 corp: 10/73b lim: 8 exec/s: 28 rss: 76Mb L: 8/8 MS: 1 ChangeBinInt-
00:09:04.562 [2024-10-27 21:34:06.181255] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5
00:09:04.562 #34 NEW cov: 11175 ft: 17605 corp: 11/81b lim: 8 exec/s: 17 rss: 76Mb L: 8/8 MS: 1 CopyPart-
00:09:04.562 #34 DONE cov: 11175 ft: 17605 corp: 11/81b lim: 8 exec/s: 17 rss: 76Mb
00:09:04.562 Done 34 runs in 2 second(s)
00:09:04.821 [2024-10-27 21:34:06.304131] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%;
00:09:04.821 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
00:09:04.821 21:34:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3
00:09:05.081 [2024-10-27 21:34:06.550266] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization...
00:09:05.081 [2024-10-27 21:34:06.550341] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3334576 ]
00:09:05.081 [2024-10-27 21:34:06.685828] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation.
00:09:05.081 [2024-10-27 21:34:06.730650] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:05.081 [2024-10-27 21:34:06.754483] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:09:05.340 INFO: Running with entropic power schedule (0xFF, 100).
00:09:05.340 INFO: Seed: 2946382197
00:09:05.340 INFO: Loaded 1 modules (383607 inline 8-bit counters): 383607 [0x2a646cc, 0x2ac2143),
00:09:05.340 INFO: Loaded 1 PC tables (383607 PCs): 383607 [0x2ac2148,0x309c8b8),
00:09:05.340 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3
00:09:05.340 INFO: A corpus is not provided, starting from an empty corpus
00:09:05.340 #2 INITED exec/s: 0 rss: 65Mb
00:09:05.340 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:05.340 This may also happen if the target rejected all inputs we tried so far
00:09:05.340 [2024-10-27 21:34:06.992787] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller
00:09:05.858 NEW_FUNC[1/673]: 0x45fdf8 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124
00:09:05.858 NEW_FUNC[2/673]: 0x464288 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:09:05.858 #56 NEW cov: 11131 ft: 11096 corp: 2/33b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 4 InsertRepeatedBytes-ChangeByte-InsertRepeatedBytes-CopyPart-
00:09:06.117 #62 NEW cov: 11149 ft: 14319 corp: 3/65b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeBit-
00:09:06.117 NEW_FUNC[1/1]: 0x1c0c068 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662
00:09:06.117 #63 NEW cov: 11166 ft: 15996 corp: 4/97b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ShuffleBytes-
00:09:06.376 #64 NEW cov: 11166 ft: 16196 corp: 5/129b lim: 32 exec/s: 64 rss: 73Mb L: 32/32 MS: 1 CopyPart-
00:09:06.634 #65 NEW cov: 11166 ft: 16510 corp: 6/161b lim: 32 exec/s: 65 rss: 73Mb L: 32/32 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"-
00:09:06.634 #71 NEW cov: 11166 ft: 16823 corp: 7/193b lim: 32 exec/s: 71 rss: 73Mb L: 32/32 MS: 1 ChangeByte-
00:09:06.893 #77 NEW cov: 11166 ft: 16970 corp: 8/225b lim: 32 exec/s: 77 rss: 73Mb L: 32/32 MS: 1 CrossOver-
00:09:07.152 #78 NEW cov: 11166 ft: 17047 corp: 9/257b lim: 32 exec/s: 78 rss: 73Mb L: 32/32 MS: 1 CrossOver-
00:09:07.152 #79 NEW cov: 11173 ft: 17126 corp: 10/289b lim: 32 exec/s: 79 rss: 73Mb L: 32/32 MS: 1 ShuffleBytes-
00:09:07.411 #90 NEW cov: 11173 ft: 17148 corp: 11/321b lim: 32 exec/s: 45 rss: 73Mb L: 32/32 MS: 1 CrossOver-
00:09:07.411 #90 DONE cov: 11173 ft: 17148 corp: 11/321b lim: 32 exec/s: 45 rss: 73Mb
00:09:07.411 ###### Recommended dictionary. ######
00:09:07.411 "\377\377\377\377\377\377\377\377" # Uses: 1
00:09:07.411 ###### End of recommended dictionary. ######
00:09:07.411 Done 90 runs in 2 second(s)
00:09:07.411 [2024-10-27 21:34:09.033122] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%;
00:09:07.671 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
00:09:07.671 21:34:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4
00:09:07.671 [2024-10-27 21:34:09.285829] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization...
00:09:07.671 [2024-10-27 21:34:09.285898] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3335107 ]
00:09:07.930 [2024-10-27 21:34:09.421066] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation.
00:09:07.930 [2024-10-27 21:34:09.465279] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:07.930 [2024-10-27 21:34:09.487360] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:09:08.192 INFO: Running with entropic power schedule (0xFF, 100).
00:09:08.192 INFO: Seed: 1379413825
00:09:08.192 INFO: Loaded 1 modules (383607 inline 8-bit counters): 383607 [0x2a646cc, 0x2ac2143),
00:09:08.192 INFO: Loaded 1 PC tables (383607 PCs): 383607 [0x2ac2148,0x309c8b8),
00:09:08.192 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4
00:09:08.192 INFO: A corpus is not provided, starting from an empty corpus
00:09:08.192 #2 INITED exec/s: 0 rss: 66Mb
00:09:08.192 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:08.192 This may also happen if the target rejected all inputs we tried so far
00:09:08.192 [2024-10-27 21:34:09.717731] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller
00:09:08.451 NEW_FUNC[1/673]: 0x460678 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144
00:09:08.451 NEW_FUNC[2/673]: 0x464288 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:09:08.451 #36 NEW cov: 11104 ft: 10771 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 4 InsertByte-InsertByte-InsertRepeatedBytes-CopyPart-
00:09:08.709 #37 NEW cov: 11151 ft: 14089 corp: 3/65b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 CopyPart-
00:09:08.968 NEW_FUNC[1/1]: 0x1c0c068 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662
00:09:08.968 #43 NEW cov: 11168 ft: 15048 corp: 4/97b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 ChangeBit-
00:09:09.226 #44 NEW cov: 11168 ft: 15252 corp: 5/129b lim: 32 exec/s: 44 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt-
00:09:09.226 #45 NEW cov: 11168 ft: 16363 corp: 6/161b lim: 32 exec/s: 45 rss: 75Mb L: 32/32 MS: 1 CMP- DE: "\005\000"-
00:09:09.485 #46 NEW cov: 11168 ft: 16451 corp: 7/193b lim: 32 exec/s: 46 rss: 75Mb L: 32/32 MS: 1 PersAutoDict- DE: "\005\000"-
00:09:09.743 #47 NEW cov: 11168 ft: 16689 corp: 8/225b lim: 32 exec/s: 47 rss: 75Mb L: 32/32 MS: 1 CopyPart-
00:09:09.743 #53 NEW cov: 11168 ft: 17218 corp: 9/257b lim: 32 exec/s: 53 rss: 75Mb L: 32/32 MS: 1 CrossOver-
00:09:10.001 #54 NEW cov: 11175 ft: 17593 corp: 10/289b lim: 32 exec/s: 54 rss: 75Mb L: 32/32 MS: 1 CopyPart-
00:09:10.259 #60 NEW cov: 11175 ft: 17894 corp: 11/321b lim: 32 exec/s: 30 rss: 75Mb L: 32/32 MS: 1 ChangeByte-
00:09:10.259 #60 DONE cov: 11175 ft: 17894 corp: 11/321b lim: 32 exec/s: 30 rss: 75Mb
00:09:10.259 ###### Recommended dictionary. ######
00:09:10.259 "\005\000" # Uses: 1
00:09:10.259 ###### End of recommended dictionary. ######
00:09:10.259 Done 60 runs in 2 second(s)
00:09:10.259 [2024-10-27 21:34:11.845130] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%;
00:09:10.518 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
00:09:10.518 21:34:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5
00:09:10.518 [2024-10-27 21:34:12.089968] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization...
00:09:10.518 [2024-10-27 21:34:12.090043] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3335647 ]
00:09:10.930 [2024-10-27 21:34:12.224067] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation.
00:09:10.776 [2024-10-27 21:34:12.268110] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:10.776 [2024-10-27 21:34:12.290836] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:10.776 INFO: Running with entropic power schedule (0xFF, 100). 00:09:10.776 INFO: Seed: 4185415657 00:09:10.776 INFO: Loaded 1 modules (383607 inline 8-bit counters): 383607 [0x2a646cc, 0x2ac2143), 00:09:10.776 INFO: Loaded 1 PC tables (383607 PCs): 383607 [0x2ac2148,0x309c8b8), 00:09:10.776 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:10.776 INFO: A corpus is not provided, starting from an empty corpus 00:09:10.776 #2 INITED exec/s: 0 rss: 66Mb 00:09:10.776 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:10.776 This may also happen if the target rejected all inputs we tried so far 00:09:11.035 [2024-10-27 21:34:12.532475] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:09:11.035 [2024-10-27 21:34:12.575985] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.035 [2024-10-27 21:34:12.576022] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.294 NEW_FUNC[1/673]: 0x461078 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:09:11.294 NEW_FUNC[2/673]: 0x464288 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:11.294 #14 NEW cov: 11127 ft: 11102 corp: 2/14b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 2 InsertByte-InsertRepeatedBytes- 00:09:11.553 [2024-10-27 21:34:13.034220] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.553 [2024-10-27 21:34:13.034261] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.553 NEW_FUNC[1/1]: 0x20b3e38 in spdk_bit_array_get /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/util/bit_array.c:152 00:09:11.553 #15 NEW cov: 11147 ft: 14793 corp: 3/27b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 1 ShuffleBytes- 00:09:11.553 [2024-10-27 21:34:13.216452] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.553 [2024-10-27 21:34:13.216483] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.812 NEW_FUNC[1/1]: 0x1c0c068 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:11.812 #21 NEW cov: 11167 ft: 15770 corp: 4/40b lim: 13 exec/s: 0 rss: 75Mb L: 13/13 MS: 1 ChangeByte- 00:09:11.812 [2024-10-27 21:34:13.386371] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.812 [2024-10-27 21:34:13.386401] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.812 #22 NEW cov: 11167 ft: 16994 corp: 5/53b lim: 13 exec/s: 0 rss: 75Mb L: 13/13 MS: 1 ChangeByte- 00:09:12.071 [2024-10-27 21:34:13.567048] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.071 [2024-10-27 21:34:13.567080] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.071 #28 NEW cov: 11167 ft: 17204 corp: 6/66b lim: 13 exec/s: 28 rss: 75Mb L: 13/13 MS: 1 CMP- DE: "\376\377\377\364"- 00:09:12.071 [2024-10-27 21:34:13.736996] 
vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.071 [2024-10-27 21:34:13.737027] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.330 #34 NEW cov: 11167 ft: 17730 corp: 7/79b lim: 13 exec/s: 34 rss: 75Mb L: 13/13 MS: 1 PersAutoDict- DE: "\376\377\377\364"- 00:09:12.330 [2024-10-27 21:34:13.905574] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.330 [2024-10-27 21:34:13.905604] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.330 #35 NEW cov: 11167 ft: 17974 corp: 8/92b lim: 13 exec/s: 35 rss: 75Mb L: 13/13 MS: 1 CrossOver- 00:09:12.588 [2024-10-27 21:34:14.084754] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.588 [2024-10-27 21:34:14.084783] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.588 #41 NEW cov: 11167 ft: 18010 corp: 9/105b lim: 13 exec/s: 41 rss: 76Mb L: 13/13 MS: 1 ShuffleBytes- 00:09:12.588 [2024-10-27 21:34:14.253973] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.588 [2024-10-27 21:34:14.254003] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.847 #42 NEW cov: 11174 ft: 18114 corp: 10/118b lim: 13 exec/s: 42 rss: 76Mb L: 13/13 MS: 1 ChangeByte- 00:09:12.847 [2024-10-27 21:34:14.433888] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.847 [2024-10-27 21:34:14.433918] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.847 #43 NEW cov: 11174 ft: 18220 corp: 11/131b lim: 13 exec/s: 21 rss: 76Mb L: 13/13 MS: 1 ChangeBit- 00:09:12.847 #43 DONE cov: 11174 ft: 18220 corp: 11/131b lim: 13 exec/s: 21 rss: 76Mb 00:09:12.847 ###### Recommended dictionary. ###### 00:09:12.847 "\376\377\377\364" # Uses: 1 00:09:12.847 ###### End of recommended dictionary. 
###### 00:09:12.847 Done 43 runs in 2 second(s) 00:09:12.847 [2024-10-27 21:34:14.551126] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:13.106 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:13.106 21:34:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:13.106 [2024-10-27 21:34:14.803698] Starting SPDK v25.01-pre git sha1 169c3cd04 / DPDK 24.11.0-rc1 initialization... 00:09:13.106 [2024-10-27 21:34:14.803766] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3336109 ] 00:09:13.365 [2024-10-27 21:34:14.939169] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc1 is used. There is no support for it in SPDK. Enabled only for validation. 
00:09:13.365 [2024-10-27 21:34:14.984280] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:13.365 [2024-10-27 21:34:15.010160] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:13.624 INFO: Running with entropic power schedule (0xFF, 100). 00:09:13.624 INFO: Seed: 2609446000 00:09:13.624 INFO: Loaded 1 modules (383607 inline 8-bit counters): 383607 [0x2a646cc, 0x2ac2143), 00:09:13.624 INFO: Loaded 1 PC tables (383607 PCs): 383607 [0x2ac2148,0x309c8b8), 00:09:13.624 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:13.624 INFO: A corpus is not provided, starting from an empty corpus 00:09:13.624 #2 INITED exec/s: 0 rss: 66Mb 00:09:13.624 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:13.624 This may also happen if the target rejected all inputs we tried so far 00:09:13.624 [2024-10-27 21:34:15.253218] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:09:13.624 [2024-10-27 21:34:15.295959] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.624 [2024-10-27 21:34:15.295990] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.142 NEW_FUNC[1/672]: 0x461d68 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:14.142 NEW_FUNC[2/672]: 0x464288 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:14.142 #12 NEW cov: 11124 ft: 11104 corp: 2/10b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 5 InsertByte-CopyPart-InsertByte-InsertRepeatedBytes-CopyPart- 00:09:14.142 [2024-10-27 21:34:15.750349] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.142 [2024-10-27 21:34:15.750390] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.401 NEW_FUNC[1/2]: 0x1fa26c8 in msg_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:833 00:09:14.401 NEW_FUNC[2/2]: 0x210c8a8 in spdk_u32log2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/util/math.c:20 00:09:14.401 #14 NEW cov: 11145 ft: 14607 corp: 3/19b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 2 InsertByte-InsertRepeatedBytes- 00:09:14.401 [2024-10-27 21:34:15.947871] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.401 [2024-10-27 21:34:15.947903] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.401 NEW_FUNC[1/1]: 0x1c0c068 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:14.401 #25 NEW cov: 11165 ft: 15383 corp: 4/28b lim: 9 exec/s: 0 rss: 75Mb L: 9/9 MS: 1 CopyPart- 00:09:14.401 [2024-10-27 21:34:16.122469] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.401 [2024-10-27 21:34:16.122500] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.661 #26 NEW cov: 11165 ft: 16612 corp: 5/37b lim: 9 exec/s: 26 rss: 75Mb L: 9/9 MS: 1 CMP- DE: "\377\017"- 00:09:14.661 [2024-10-27 21:34:16.309120] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.661 [2024-10-27 21:34:16.309153] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.920 #27 
NEW cov: 11165 ft: 16869 corp: 6/46b lim: 9 exec/s: 27 rss: 75Mb L: 9/9 MS: 1 CopyPart- 00:09:14.920 [2024-10-27 21:34:16.483547] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.920 [2024-10-27 21:34:16.483578] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.920 #38 NEW cov: 11165 ft: 17396 corp: 7/55b lim: 9 exec/s: 38 rss: 75Mb L: 9/9 MS: 1 CopyPart- 00:09:15.178 [2024-10-27 21:34:16.655815] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.178 [2024-10-27 21:34:16.655846] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.178 #44 NEW cov: 11165 ft: 17665 corp: 8/64b lim: 9 exec/s: 44 rss: 75Mb L: 9/9 MS: 1 CopyPart- 00:09:15.178 [2024-10-27 21:34:16.829048] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.178 [2024-10-27 21:34:16.829079] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.437 #45 NEW cov: 11165 ft: 18083 corp: 9/73b lim: 9 exec/s: 45 rss: 76Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:09:15.437 [2024-10-27 21:34:17.005581] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.437 [2024-10-27 21:34:17.005611] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.437 #46 NEW cov: 11172 ft: 18131 corp: 10/82b lim: 9 exec/s: 46 rss: 76Mb L: 9/9 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:09:15.696 [2024-10-27 21:34:17.179344] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.696 [2024-10-27 21:34:17.179374] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.696 #47 NEW cov: 11172 ft: 18436 corp: 11/91b lim: 9 exec/s: 23 rss: 76Mb L: 9/9 MS: 1 ShuffleBytes- 00:09:15.696 #47 DONE cov: 11172 ft: 18436 corp: 11/91b lim: 9 exec/s: 23 rss: 76Mb 00:09:15.696 ###### Recommended dictionary. ###### 00:09:15.696 "\377\017" # Uses: 2 00:09:15.696 "\000\000\000\000\000\000\000\000" # Uses: 1 00:09:15.696 ###### End of recommended dictionary. 
###### 00:09:15.697 Done 47 runs in 2 second(s) 00:09:15.697 [2024-10-27 21:34:17.300133] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:09:15.956 21:34:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:09:15.956 21:34:17 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:15.956 21:34:17 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:15.956 21:34:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:09:15.956 00:09:15.956 real 0m19.734s 00:09:15.956 user 0m26.889s 00:09:15.956 sys 0m1.794s 00:09:15.956 21:34:17 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:15.956 21:34:17 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:15.956 ************************************ 00:09:15.956 END TEST vfio_llvm_fuzz 00:09:15.956 ************************************ 00:09:15.956 00:09:15.956 real 1m27.046s 00:09:15.956 user 2m6.303s 00:09:15.956 sys 0m11.060s 00:09:15.956 21:34:17 llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:15.956 21:34:17 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:15.956 ************************************ 00:09:15.956 END TEST llvm_fuzz 00:09:15.956 ************************************ 00:09:15.956 21:34:17 -- spdk/autotest.sh@374 -- # [[ '' -eq 1 ]] 00:09:15.956 21:34:17 -- spdk/autotest.sh@381 -- # trap - SIGINT SIGTERM EXIT 00:09:15.956 21:34:17 -- spdk/autotest.sh@383 -- # timing_enter post_cleanup 00:09:15.956 21:34:17 -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:15.956 21:34:17 -- common/autotest_common.sh@10 -- # set +x 00:09:15.956 21:34:17 -- spdk/autotest.sh@384 -- # autotest_cleanup 00:09:15.956 21:34:17 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:09:15.956 21:34:17 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:09:15.956 21:34:17 -- common/autotest_common.sh@10 -- # set +x 00:09:22.525 INFO: APP EXITING 00:09:22.525 INFO: killing all VMs 00:09:22.525 INFO: killing vhost app 00:09:22.525 INFO: EXIT DONE 00:09:25.812 Waiting for block devices as requested 00:09:25.812 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:25.812 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:25.812 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:26.071 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:26.071 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:26.071 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:26.071 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:26.330 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:26.330 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:26.330 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:26.589 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:26.589 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:26.589 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:26.847 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:26.847 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:26.847 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:27.104 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:30.389 Cleaning 00:09:30.389 Removing: /dev/shm/spdk_tgt_trace.pid3305787 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3303324 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3304573 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3305787 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3306503 00:09:30.389 Removing: 
/var/run/dpdk/spdk_pid3307499 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3307716 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3308724 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3308962 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3309354 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3309724 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3310079 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3310423 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3310767 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3311050 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3311277 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3311601 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3312514 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3315686 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3315990 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3316288 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3316548 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3317114 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3317135 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3317702 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3317963 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3318259 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3318276 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3318568 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3318731 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3319214 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3319506 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3319793 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3319948 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3320635 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3321164 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3321705 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3322067 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3322653 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3323426 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3324154 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3324532 00:09:30.389 Removing: /var/run/dpdk/spdk_pid3324980 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3325515 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3326044 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3326375 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3326892 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3327421 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3327833 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3328249 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3328786 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3329320 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3329642 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3330148 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3330682 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3331153 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3331501 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3332041 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3332570 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3333211 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3333736 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3334144 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3334576 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3335107 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3335647 00:09:30.390 Removing: /var/run/dpdk/spdk_pid3336109 00:09:30.390 Clean 00:09:30.390 21:34:31 -- common/autotest_common.sh@1449 -- # return 0 00:09:30.390 21:34:31 -- spdk/autotest.sh@385 -- # timing_exit post_cleanup 00:09:30.390 21:34:31 -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:30.390 21:34:31 -- common/autotest_common.sh@10 -- # set +x 00:09:30.390 21:34:31 -- 
spdk/autotest.sh@387 -- # timing_exit autotest 00:09:30.390 21:34:31 -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:30.390 21:34:31 -- common/autotest_common.sh@10 -- # set +x 00:09:30.390 21:34:31 -- spdk/autotest.sh@388 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:30.390 21:34:31 -- spdk/autotest.sh@390 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:30.390 21:34:31 -- spdk/autotest.sh@390 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:30.390 21:34:31 -- spdk/autotest.sh@392 -- # [[ y == y ]] 00:09:30.390 21:34:32 -- spdk/autotest.sh@394 -- # hostname 00:09:30.390 21:34:32 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:30.647 geninfo: WARNING: invalid characters removed from testname! 00:09:33.933 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda 00:09:40.493 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda 00:09:43.021 21:34:44 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:51.251 21:34:52 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:56.536 21:34:57 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:01.799 21:35:02 -- spdk/autotest.sh@401 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:07.068 21:35:07 -- spdk/autotest.sh@402 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:12.365 21:35:13 -- spdk/autotest.sh@403 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:17.632 21:35:18 -- spdk/autotest.sh@404 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:10:17.632 21:35:18 -- common/autotest_common.sh@1688 -- $ [[ y == y ]] 00:10:17.632 21:35:18 -- common/autotest_common.sh@1689 -- $ lcov --version 00:10:17.632 21:35:18 -- common/autotest_common.sh@1689 -- $ awk '{print $NF}' 00:10:17.632 21:35:18 -- common/autotest_common.sh@1689 -- $ lt 1.15 2 00:10:17.632 21:35:18 -- scripts/common.sh@373 -- $ cmp_versions 1.15 '<' 2 00:10:17.632 21:35:18 -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:10:17.632 21:35:18 -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:10:17.632 21:35:18 -- scripts/common.sh@336 -- $ IFS=.-: 00:10:17.632 21:35:18 -- scripts/common.sh@336 -- $ read -ra ver1 00:10:17.632 21:35:18 -- scripts/common.sh@337 -- $ IFS=.-: 00:10:17.632 21:35:18 -- scripts/common.sh@337 -- $ read -ra ver2 00:10:17.632 21:35:18 -- scripts/common.sh@338 -- $ local 'op=<' 00:10:17.632 21:35:18 -- scripts/common.sh@340 -- $ ver1_l=2 00:10:17.632 21:35:18 -- scripts/common.sh@341 -- $ ver2_l=1 00:10:17.632 21:35:18 -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:10:17.632 21:35:18 -- scripts/common.sh@344 -- $ case "$op" in 00:10:17.632 21:35:18 -- scripts/common.sh@345 -- $ : 1 00:10:17.632 21:35:18 -- scripts/common.sh@364 -- $ (( v = 0 )) 00:10:17.632 21:35:18 -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:10:17.632 21:35:18 -- scripts/common.sh@365 -- $ decimal 1 00:10:17.632 21:35:18 -- scripts/common.sh@353 -- $ local d=1 00:10:17.632 21:35:18 -- scripts/common.sh@354 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:10:17.632 21:35:18 -- scripts/common.sh@355 -- $ echo 1 00:10:17.632 21:35:18 -- scripts/common.sh@365 -- $ ver1[v]=1 00:10:17.632 21:35:18 -- scripts/common.sh@366 -- $ decimal 2 00:10:17.632 21:35:18 -- scripts/common.sh@353 -- $ local d=2 00:10:17.632 21:35:18 -- scripts/common.sh@354 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:10:17.632 21:35:18 -- scripts/common.sh@355 -- $ echo 2 00:10:17.632 21:35:18 -- scripts/common.sh@366 -- $ ver2[v]=2 00:10:17.632 21:35:18 -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:10:17.632 21:35:18 -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:10:17.632 21:35:18 -- scripts/common.sh@368 -- $ return 0 00:10:17.632 21:35:18 -- common/autotest_common.sh@1690 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:10:17.632 21:35:18 -- common/autotest_common.sh@1702 -- $ export 'LCOV_OPTS= 00:10:17.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:17.632 --rc genhtml_branch_coverage=1 00:10:17.632 --rc genhtml_function_coverage=1 00:10:17.632 --rc genhtml_legend=1 00:10:17.632 --rc geninfo_all_blocks=1 00:10:17.632 --rc geninfo_unexecuted_blocks=1 00:10:17.632 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:17.632 ' 00:10:17.632 21:35:18 -- common/autotest_common.sh@1702 -- $ LCOV_OPTS=' 00:10:17.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:17.632 --rc genhtml_branch_coverage=1 00:10:17.632 --rc genhtml_function_coverage=1 00:10:17.632 --rc genhtml_legend=1 00:10:17.632 --rc geninfo_all_blocks=1 00:10:17.632 --rc geninfo_unexecuted_blocks=1 00:10:17.632 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:17.632 ' 00:10:17.632 21:35:18 -- common/autotest_common.sh@1703 -- $ export 'LCOV=lcov 00:10:17.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:17.632 --rc genhtml_branch_coverage=1 00:10:17.632 --rc genhtml_function_coverage=1 00:10:17.632 --rc genhtml_legend=1 00:10:17.632 --rc geninfo_all_blocks=1 00:10:17.632 --rc geninfo_unexecuted_blocks=1 00:10:17.632 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:17.632 ' 00:10:17.632 21:35:18 -- common/autotest_common.sh@1703 -- $ LCOV='lcov 00:10:17.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:17.632 --rc genhtml_branch_coverage=1 00:10:17.632 --rc genhtml_function_coverage=1 00:10:17.632 --rc genhtml_legend=1 00:10:17.632 --rc geninfo_all_blocks=1 00:10:17.632 --rc geninfo_unexecuted_blocks=1 00:10:17.632 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:17.632 ' 00:10:17.632 21:35:18 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:10:17.632 21:35:18 -- scripts/common.sh@15 -- $ shopt -s extglob 00:10:17.632 21:35:18 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:10:17.632 21:35:18 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:17.632 21:35:18 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:17.632 21:35:18 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:17.633 21:35:18 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:17.633 21:35:18 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:17.633 21:35:18 -- paths/export.sh@5 -- $ export PATH 00:10:17.633 21:35:18 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:17.633 21:35:18 -- common/autobuild_common.sh@485 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:10:17.633 21:35:18 -- common/autobuild_common.sh@486 -- $ date +%s 00:10:17.633 21:35:18 -- common/autobuild_common.sh@486 -- $ mktemp -dt spdk_1730061318.XXXXXX 00:10:17.633 21:35:18 -- common/autobuild_common.sh@486 -- $ SPDK_WORKSPACE=/tmp/spdk_1730061318.X6EIeD 00:10:17.633 21:35:18 -- common/autobuild_common.sh@488 -- $ [[ -n '' ]] 00:10:17.633 21:35:18 -- common/autobuild_common.sh@492 -- $ '[' -n main ']' 00:10:17.633 21:35:18 -- common/autobuild_common.sh@493 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:10:17.633 21:35:18 -- common/autobuild_common.sh@493 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:10:17.633 21:35:18 -- common/autobuild_common.sh@499 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:10:17.633 21:35:18 -- common/autobuild_common.sh@501 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:10:17.633 21:35:18 -- common/autobuild_common.sh@502 -- $ get_config_params 00:10:17.633 21:35:18 -- common/autotest_common.sh@407 -- $ xtrace_disable 00:10:17.633 21:35:18 -- common/autotest_common.sh@10 -- $ set +x 00:10:17.633 21:35:18 -- common/autobuild_common.sh@502 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user' 00:10:17.633 21:35:18 -- common/autobuild_common.sh@504 -- $ start_monitor_resources 00:10:17.633 21:35:18 -- pm/common@17 -- $ local monitor 00:10:17.633 21:35:18 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:17.633 21:35:18 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:17.633 21:35:18 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:17.633 21:35:18 -- pm/common@21 -- $ date +%s 00:10:17.633 21:35:18 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:17.633 21:35:18 -- pm/common@21 -- $ date +%s 00:10:17.633 21:35:18 -- pm/common@25 -- $ sleep 1 00:10:17.633 21:35:18 -- pm/common@21 -- $ date +%s 00:10:17.633 21:35:18 -- pm/common@21 -- $ date +%s 00:10:17.633 21:35:18 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1730061318 00:10:17.633 21:35:18 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1730061318 00:10:17.633 21:35:18 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1730061318 00:10:17.633 21:35:18 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1730061318 00:10:17.633 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1730061318_collect-cpu-load.pm.log 00:10:17.633 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1730061318_collect-vmstat.pm.log 00:10:17.633 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1730061318_collect-cpu-temp.pm.log 00:10:17.633 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1730061318_collect-bmc-pm.bmc.pm.log 00:10:17.892 21:35:19 -- common/autobuild_common.sh@505 -- $ trap stop_monitor_resources EXIT 00:10:17.892 21:35:19 -- spdk/autopackage.sh@10 -- $ [[ 0 -eq 1 ]] 00:10:17.892 21:35:19 -- spdk/autopackage.sh@14 -- $ timing_finish 00:10:17.892 21:35:19 -- common/autotest_common.sh@736 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:10:17.892 21:35:19 -- common/autotest_common.sh@737 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]] 00:10:17.892 21:35:19 -- common/autotest_common.sh@740 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:10:18.150 21:35:19 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:10:18.150 21:35:19 -- pm/common@29 -- $ signal_monitor_resources TERM 00:10:18.150 21:35:19 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:10:18.150 21:35:19 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:18.150 21:35:19 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:10:18.150 21:35:19 -- pm/common@44 -- $ pid=3344639 00:10:18.150 21:35:19 -- pm/common@50 -- $ kill 
-TERM 3344639 00:10:18.150 21:35:19 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:18.150 21:35:19 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:10:18.150 21:35:19 -- pm/common@44 -- $ pid=3344641 00:10:18.150 21:35:19 -- pm/common@50 -- $ kill -TERM 3344641 00:10:18.150 21:35:19 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:18.150 21:35:19 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:10:18.150 21:35:19 -- pm/common@44 -- $ pid=3344643 00:10:18.150 21:35:19 -- pm/common@50 -- $ kill -TERM 3344643 00:10:18.150 21:35:19 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:18.150 21:35:19 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:10:18.150 21:35:19 -- pm/common@44 -- $ pid=3344665 00:10:18.150 21:35:19 -- pm/common@50 -- $ sudo -E kill -TERM 3344665 00:10:18.150 + [[ -n 3177714 ]] 00:10:18.150 + sudo kill 3177714 00:10:18.158 [Pipeline] } 00:10:18.171 [Pipeline] // stage 00:10:18.175 [Pipeline] } 00:10:18.188 [Pipeline] // timeout 00:10:18.192 [Pipeline] } 00:10:18.205 [Pipeline] // catchError 00:10:18.209 [Pipeline] } 00:10:18.221 [Pipeline] // wrap 00:10:18.224 [Pipeline] } 00:10:18.234 [Pipeline] // catchError 00:10:18.241 [Pipeline] stage 00:10:18.243 [Pipeline] { (Epilogue) 00:10:18.253 [Pipeline] catchError 00:10:18.255 [Pipeline] { 00:10:18.265 [Pipeline] echo 00:10:18.267 Cleanup processes 00:10:18.272 [Pipeline] sh 00:10:18.550 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:18.550 3344776 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/sdr.cache 00:10:18.550 3345202 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:18.565 [Pipeline] sh 00:10:18.849 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:18.849 ++ grep -v 'sudo pgrep' 00:10:18.849 ++ awk '{print $1}' 00:10:18.849 + sudo kill -9 3344776 00:10:18.861 [Pipeline] sh 00:10:19.140 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:10:19.140 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:19.140 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:20.512 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:30.482 [Pipeline] sh 00:10:30.763 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:10:30.763 Artifacts sizes are good 00:10:30.777 [Pipeline] archiveArtifacts 00:10:30.784 Archiving artifacts 00:10:30.927 [Pipeline] sh 00:10:31.210 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:10:31.225 [Pipeline] cleanWs 00:10:31.235 [WS-CLEANUP] Deleting project workspace... 00:10:31.235 [WS-CLEANUP] Deferred wipeout is used... 00:10:31.241 [WS-CLEANUP] done 00:10:31.243 [Pipeline] } 00:10:31.260 [Pipeline] // catchError 00:10:31.272 [Pipeline] sh 00:10:31.608 + logger -p user.info -t JENKINS-CI 00:10:31.617 [Pipeline] } 00:10:31.631 [Pipeline] // stage 00:10:31.636 [Pipeline] } 00:10:31.650 [Pipeline] // node 00:10:31.655 [Pipeline] End of Pipeline 00:10:31.693 Finished: SUCCESS