00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v23.11" build number 346
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3011
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.016 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.018 The recommended git tool is: git
00:00:00.018 using credential 00000000-0000-0000-0000-000000000002
00:00:00.019 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.031 Fetching changes from the remote Git repository
00:00:00.032 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.047 Using shallow fetch with depth 1
00:00:00.047 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.047 > git --version # timeout=10
00:00:00.076 > git --version # 'git version 2.39.2'
00:00:00.076 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.077 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.077 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.614 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.625 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.639 Checking out Revision 6201031def5bfb7f90a861bc162998684798607e (FETCH_HEAD)
00:00:02.639 > git config core.sparsecheckout # timeout=10
00:00:02.651 > git read-tree -mu HEAD # timeout=10
00:00:02.668 > git checkout -f 6201031def5bfb7f90a861bc162998684798607e # timeout=5
00:00:02.685 Commit message: "scripts/kid: Add issue 3354"
00:00:02.686 > git rev-list --no-walk 6201031def5bfb7f90a861bc162998684798607e # timeout=10
00:00:02.782 [Pipeline] Start of Pipeline
00:00:02.799 [Pipeline] library
00:00:02.801 Loading library shm_lib@master
00:00:02.801 Library shm_lib@master is cached. Copying from home.
00:00:02.820 [Pipeline] node
00:00:02.846 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:02.848 [Pipeline] {
00:00:02.861 [Pipeline] catchError
00:00:02.864 [Pipeline] {
00:00:02.879 [Pipeline] wrap
00:00:02.888 [Pipeline] {
00:00:02.893 [Pipeline] stage
00:00:02.894 [Pipeline] { (Prologue)
00:00:03.041 [Pipeline] sh
00:00:03.324 + logger -p user.info -t JENKINS-CI
00:00:03.339 [Pipeline] echo
00:00:03.340 Node: WFP20
00:00:03.345 [Pipeline] sh
00:00:03.630 [Pipeline] setCustomBuildProperty
00:00:03.639 [Pipeline] echo
00:00:03.640 Cleanup processes
00:00:03.644 [Pipeline] sh
00:00:03.923 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.924 344579 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.935 [Pipeline] sh
00:00:04.214 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.214 ++ grep -v 'sudo pgrep'
00:00:04.214 ++ awk '{print $1}'
00:00:04.214 + sudo kill -9
00:00:04.214 + true
00:00:04.227 [Pipeline] cleanWs
00:00:04.235 [WS-CLEANUP] Deleting project workspace...
00:00:04.235 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.240 [WS-CLEANUP] done
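A note on the cleanup step just above: the job lists anything still running out of the workspace with pgrep, filters out its own pgrep invocation, and force-kills the surviving PIDs. In this run nothing matched, so `kill -9` ran with an empty argument list and failed, which the trailing `+ true` absorbed. A standalone sketch of the same pattern, with the empty case guarded explicitly (the workspace path is this job's; the guard is an illustrative variation, not the CI script itself):

    #!/usr/bin/env bash
    # Kill any processes still running out of the job workspace.
    workspace=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    # pgrep -af prints "<pid> <full command line>"; drop our own pgrep, keep PIDs.
    pids=$(sudo pgrep -af "$workspace" | grep -v 'sudo pgrep' | awk '{print $1}')
    # Only invoke kill when something matched, instead of relying on "|| true".
    [[ -n "$pids" ]] && sudo kill -9 $pids
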
00:00:04.244 [Pipeline] setCustomBuildProperty
00:00:04.257 [Pipeline] sh
00:00:04.531 + sudo git config --global --replace-all safe.directory '*'
00:00:04.605 [Pipeline] nodesByLabel
00:00:04.607 Found a total of 1 nodes with the 'sorcerer' label
00:00:04.616 [Pipeline] httpRequest
00:00:04.620 HttpMethod: GET
00:00:04.621 URL: http://10.211.164.96/packages/jbp_6201031def5bfb7f90a861bc162998684798607e.tar.gz
00:00:04.624 Sending request to url: http://10.211.164.96/packages/jbp_6201031def5bfb7f90a861bc162998684798607e.tar.gz
00:00:04.626 Response Code: HTTP/1.1 200 OK
00:00:04.626 Success: Status code 200 is in the accepted range: 200,404
00:00:04.627 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_6201031def5bfb7f90a861bc162998684798607e.tar.gz
00:00:05.417 [Pipeline] sh
00:00:05.696 + tar --no-same-owner -xf jbp_6201031def5bfb7f90a861bc162998684798607e.tar.gz
00:00:05.716 [Pipeline] httpRequest
00:00:05.720 HttpMethod: GET
00:00:05.721 URL: http://10.211.164.96/packages/spdk_36faa8c312bf9059b86e0f503d7fd6b43c1498e6.tar.gz
00:00:05.721 Sending request to url: http://10.211.164.96/packages/spdk_36faa8c312bf9059b86e0f503d7fd6b43c1498e6.tar.gz
00:00:05.738 Response Code: HTTP/1.1 200 OK
00:00:05.738 Success: Status code 200 is in the accepted range: 200,404
00:00:05.738 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_36faa8c312bf9059b86e0f503d7fd6b43c1498e6.tar.gz
00:00:55.410 [Pipeline] sh
00:00:55.692 + tar --no-same-owner -xf spdk_36faa8c312bf9059b86e0f503d7fd6b43c1498e6.tar.gz
00:00:58.272 [Pipeline] sh
00:00:58.555 + git -C spdk log --oneline -n5
00:00:58.555 36faa8c31 bdev/nvme: Fix the case that namespace was removed during reset
00:00:58.555 e2cb5a5ee bdev/nvme: Factor out nvme_ns active/inactive check into a helper function
00:00:58.555 4b134b4ab bdev/nvme: Delay callbacks when the next operation is a failover
00:00:58.555 d2ea4ecb1 llvm/vfio: Suppress checking leaks for `spdk_nvme_ctrlr_alloc_io_qpair`
00:00:58.555 3b33f4333 test/nvme/cuse: Fix typo
00:00:58.575 [Pipeline] withCredentials
00:00:58.585 > git --version # timeout=10
00:00:58.599 > git --version # 'git version 2.39.2'
00:00:58.616 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:00:58.619 [Pipeline] {
00:00:58.629 [Pipeline] retry
00:00:58.631 [Pipeline] {
00:00:58.649 [Pipeline] sh
00:00:58.933 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11
00:00:59.204 [Pipeline] }
00:00:59.227 [Pipeline] // retry
00:00:59.232 [Pipeline] }
00:00:59.252 [Pipeline] // withCredentials
00:00:59.264 [Pipeline] httpRequest
00:00:59.269 HttpMethod: GET
00:00:59.270 URL: http://10.211.164.96/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:00:59.273 Sending request to url: http://10.211.164.96/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:00:59.284 Response Code: HTTP/1.1 200 OK
00:00:59.285 Success: Status code 200 is in the accepted range: 200,404
00:00:59.285 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:06.180 [Pipeline] sh
00:01:06.462 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:07.849 [Pipeline] sh
00:01:08.131 + git -C dpdk log --oneline -n5
00:01:08.131 eeb0605f11 version: 23.11.0
00:01:08.131 238778122a doc: update release notes for 23.11
00:01:08.131 46aa6b3cfc doc: fix description of RSS features
00:01:08.131 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:01:08.131 7e421ae345 devtools: support skipping forbid rule check
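Note how sources arrive in this job: instead of cloning, it pulls commit-pinned tarballs of jbp, SPDK, and DPDK from an internal package mirror (10.211.164.96) and unpacks them with tar --no-same-owner so the extracted files belong to the build user. A rough equivalent outside the pipeline, assuming the same mirror layout and with curl standing in for the Jenkins httpRequest step (the mirror is reachable only from the CI pool):

    # Fetch and unpack a commit-pinned SPDK tarball from the internal mirror.
    commit=36faa8c312bf9059b86e0f503d7fd6b43c1498e6
    curl -fO "http://10.211.164.96/packages/spdk_${commit}.tar.gz"
    tar --no-same-owner -xf "spdk_${commit}.tar.gz"
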
00:01:08.146 [Pipeline] }
00:01:08.165 [Pipeline] // stage
00:01:08.174 [Pipeline] stage
00:01:08.176 [Pipeline] { (Prepare)
00:01:08.198 [Pipeline] writeFile
00:01:08.216 [Pipeline] sh
00:01:08.497 + logger -p user.info -t JENKINS-CI
00:01:08.509 [Pipeline] sh
00:01:08.791 + logger -p user.info -t JENKINS-CI
00:01:08.803 [Pipeline] sh
00:01:09.084 + cat autorun-spdk.conf
00:01:09.085 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:09.085 SPDK_RUN_UBSAN=1
00:01:09.085 SPDK_TEST_FUZZER=1
00:01:09.085 SPDK_TEST_FUZZER_SHORT=1
00:01:09.085 SPDK_TEST_NATIVE_DPDK=v23.11
00:01:09.085 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:09.092 RUN_NIGHTLY=1
00:01:09.096 [Pipeline] readFile
00:01:09.117 [Pipeline] withEnv
00:01:09.119 [Pipeline] {
00:01:09.131 [Pipeline] sh
00:01:09.413 + set -ex
00:01:09.413 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:09.413 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:09.413 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:09.413 ++ SPDK_RUN_UBSAN=1
00:01:09.413 ++ SPDK_TEST_FUZZER=1
00:01:09.413 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:09.413 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:09.413 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:09.413 ++ RUN_NIGHTLY=1
00:01:09.413 + case $SPDK_TEST_NVMF_NICS in
00:01:09.413 + DRIVERS=
00:01:09.413 + [[ -n '' ]]
00:01:09.413 + exit 0
00:01:09.421 [Pipeline] }
00:01:09.435 [Pipeline] // withEnv
00:01:09.439 [Pipeline] }
00:01:09.452 [Pipeline] // stage
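The Prepare stage reduces the whole job to the autorun-spdk.conf written above: a plain KEY=value shell fragment that downstream scripts source, here selecting the short fuzzer suite under UBSAN against an externally built DPDK v23.11. Reproducing the run outside Jenkins comes down to recreating that file and handing it to spdk/autorun.sh, exactly as the Tests stage does below. A sketch, assuming spdk/ and dpdk/ checkouts in the current directory:

    # Recreate this job's test configuration and start the same run.
    cat > autorun-spdk.conf <<EOF
    SPDK_RUN_FUNCTIONAL_TEST=1
    SPDK_RUN_UBSAN=1
    SPDK_TEST_FUZZER=1
    SPDK_TEST_FUZZER_SHORT=1
    SPDK_TEST_NATIVE_DPDK=v23.11
    SPDK_RUN_EXTERNAL_DPDK=$PWD/dpdk/build
    RUN_NIGHTLY=1
    EOF
    ./spdk/autorun.sh "$PWD/autorun-spdk.conf"
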
00:01:09.461 [Pipeline] catchError
00:01:09.462 [Pipeline] {
00:01:09.475 [Pipeline] timeout
00:01:09.475 Timeout set to expire in 30 min
00:01:09.476 [Pipeline] {
00:01:09.488 [Pipeline] stage
00:01:09.489 [Pipeline] { (Tests)
00:01:09.501 [Pipeline] sh
00:01:09.780 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:09.780 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:09.780 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:09.780 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:09.780 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:09.780 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:09.780 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:09.780 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:09.780 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:09.780 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:09.780 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:09.780 + source /etc/os-release
00:01:09.780 ++ NAME='Fedora Linux'
00:01:09.780 ++ VERSION='38 (Cloud Edition)'
00:01:09.780 ++ ID=fedora
00:01:09.780 ++ VERSION_ID=38
00:01:09.780 ++ VERSION_CODENAME=
00:01:09.780 ++ PLATFORM_ID=platform:f38
00:01:09.780 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:09.780 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:09.780 ++ LOGO=fedora-logo-icon
00:01:09.780 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:09.780 ++ HOME_URL=https://fedoraproject.org/
00:01:09.780 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:09.780 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:09.780 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:09.780 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:09.780 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:09.780 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:09.780 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:09.780 ++ SUPPORT_END=2024-05-14
00:01:09.780 ++ VARIANT='Cloud Edition'
00:01:09.780 ++ VARIANT_ID=cloud
00:01:09.780 + uname -a
00:01:09.780 Linux spdk-wfp-20 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:01:09.780 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:13.065 Hugepages
00:01:13.065 node hugesize free / total
00:01:13.065 node0 1048576kB 0 / 0
00:01:13.065 node0 2048kB 0 / 0
00:01:13.065 node1 1048576kB 0 / 0
00:01:13.065 node1 2048kB 0 / 0
00:01:13.065
00:01:13.065 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:13.065 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:13.065 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:13.065 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:13.065 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:13.065 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:13.065 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:13.065 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:13.065 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:13.065 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:13.065 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:13.065 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:13.065 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:13.065 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:13.065 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:13.065 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:13.065 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:13.065 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:13.065 + rm -f /tmp/spdk-ld-path
00:01:13.065 + source autorun-spdk.conf
00:01:13.065 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:13.065 ++ SPDK_RUN_UBSAN=1
00:01:13.065 ++ SPDK_TEST_FUZZER=1
00:01:13.065 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:13.065 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:01:13.065 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:13.065 ++ RUN_NIGHTLY=1
00:01:13.065 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:13.065 + [[ -n '' ]]
00:01:13.065 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:13.065 + for M in /var/spdk/build-*-manifest.txt
00:01:13.065 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:13.065 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
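The setup.sh status block above shows no hugepages reserved yet on either NUMA node, plus the I/OAT DMA channels and the single NVMe drive (0000:d8:00.0) this run can claim. The hugepage counters come straight from sysfs and can be checked without SPDK's scripts; a small sketch against the standard kernel interface:

    # Print per-NUMA-node hugepage availability, mirroring the
    # "node hugesize free / total" lines from setup.sh status above.
    for node in /sys/devices/system/node/node*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo "${node##*/} ${hp##*hugepages-}" \
                 "$(cat "$hp"/free_hugepages) / $(cat "$hp"/nr_hugepages)"
        done
    done
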
00:01:13.065 + for M in /var/spdk/build-*-manifest.txt
00:01:13.065 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:13.065 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:13.065 ++ uname
00:01:13.065 + [[ Linux == \L\i\n\u\x ]]
00:01:13.065 + sudo dmesg -T
00:01:13.065 + sudo dmesg --clear
00:01:13.065 + dmesg_pid=345614
00:01:13.065 + [[ Fedora Linux == FreeBSD ]]
00:01:13.065 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:13.065 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:13.065 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:13.065 + [[ -x /usr/src/fio-static/fio ]]
00:01:13.065 + export FIO_BIN=/usr/src/fio-static/fio
00:01:13.065 + FIO_BIN=/usr/src/fio-static/fio
00:01:13.065 + sudo dmesg -Tw
00:01:13.065 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:13.065 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:13.065 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:13.065 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:13.065 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:13.065 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:13.065 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:13.065 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:13.065 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:13.065 Test configuration:
00:01:13.065 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:13.065 SPDK_RUN_UBSAN=1
00:01:13.065 SPDK_TEST_FUZZER=1
00:01:13.065 SPDK_TEST_FUZZER_SHORT=1
00:01:13.065 SPDK_TEST_NATIVE_DPDK=v23.11
00:01:13.065 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:13.065 RUN_NIGHTLY=1
23:48:02 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
23:48:02 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
23:48:02 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
23:48:02 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
23:48:02 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
23:48:02 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
23:48:02 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
23:48:02 -- paths/export.sh@5 -- $ export PATH
23:48:02 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
23:48:02 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
23:48:02 -- common/autobuild_common.sh@435 -- $ date +%s
23:48:02 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1714081682.XXXXXX
23:48:02 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1714081682.oans6b
23:48:02 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
23:48:02 -- common/autobuild_common.sh@441 -- $ '[' -n v23.11 ']'
23:48:02 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
23:48:02 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
23:48:02 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
23:48:02 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
23:48:02 -- common/autobuild_common.sh@451 -- $ get_config_params
23:48:02 -- common/autotest_common.sh@387 -- $ xtrace_disable
23:48:02 -- common/autotest_common.sh@10 -- $ set +x
23:48:02 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
23:48:02 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
23:48:02 -- spdk/autobuild.sh@12 -- $ umask 022
23:48:02 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
23:48:02 -- spdk/autobuild.sh@16 -- $ date -u
00:01:13.066 Thu Apr 25 09:48:02 PM UTC 2024
23:48:02 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:13.066 LTS-24-g36faa8c31
23:48:02 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
23:48:02 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
23:48:02 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
23:48:02 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
23:48:02 -- common/autotest_common.sh@1083 -- $ xtrace_disable
23:48:02 -- common/autotest_common.sh@10 -- $ set +x
00:01:13.066 ************************************
00:01:13.066 START TEST ubsan
00:01:13.066 ************************************
23:48:02 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan'
00:01:13.066 using ubsan
00:01:13.066
00:01:13.066 real 0m0.000s
00:01:13.066 user 0m0.000s
00:01:13.066 sys 0m0.000s
23:48:02 -- common/autotest_common.sh@1105 -- $ xtrace_disable
23:48:02 -- common/autotest_common.sh@10 -- $ set +x
00:01:13.066 ************************************
00:01:13.066 END TEST ubsan
00:01:13.066 ************************************
23:48:02 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']'
23:48:02 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
23:48:02 -- common/autobuild_common.sh@427 -- $ run_test build_native_dpdk _build_native_dpdk
23:48:02 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
23:48:02 -- common/autotest_common.sh@1083 -- $ xtrace_disable
23:48:02 -- common/autotest_common.sh@10 -- $ set +x
00:01:13.066 ************************************
00:01:13.066 START TEST build_native_dpdk
00:01:13.066 ************************************
23:48:02 -- common/autotest_common.sh@1104 -- $ _build_native_dpdk
23:48:02 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
23:48:02 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
23:48:02 -- common/autobuild_common.sh@50 -- $ local compiler_version
23:48:02 -- common/autobuild_common.sh@51 -- $ local compiler
23:48:02 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
23:48:02 -- common/autobuild_common.sh@53 -- $ local repo=dpdk
23:48:02 -- common/autobuild_common.sh@55 -- $ compiler=gcc
23:48:02 -- common/autobuild_common.sh@61 -- $ export CC=gcc
23:48:02 -- common/autobuild_common.sh@61 -- $ CC=gcc
23:48:02 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
23:48:02 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
23:48:02 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
23:48:02 -- common/autobuild_common.sh@68 -- $ compiler_version=13
23:48:02 -- common/autobuild_common.sh@69 -- $ compiler_version=13
23:48:02 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
23:48:02 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
23:48:02 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
23:48:02 -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
23:48:02 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
23:48:02 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:01:13.066 eeb0605f11 version: 23.11.0
00:01:13.066 238778122a doc: update release notes for 23.11
00:01:13.066 46aa6b3cfc doc: fix description of RSS features
00:01:13.066 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:01:13.066 7e421ae345 devtools: support skipping forbid rule check
23:48:02 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
23:48:02 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
23:48:02 -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0
23:48:02 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
23:48:02 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
23:48:02 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
23:48:02 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
23:48:02 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
23:48:02 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
23:48:02 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base")
23:48:02 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n
23:48:02 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
23:48:02 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
23:48:02 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]]
23:48:02 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
23:48:02 -- common/autobuild_common.sh@168 -- $ uname -s
23:48:02 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']'
23:48:02 -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0
23:48:02 -- scripts/common.sh@372 -- $ cmp_versions 23.11.0 '<' 21.11.0
23:48:02 -- scripts/common.sh@332 -- $ local ver1 ver1_l
23:48:02 -- scripts/common.sh@333 -- $ local ver2 ver2_l
23:48:02 -- scripts/common.sh@335 -- $ IFS=.-:
23:48:02 -- scripts/common.sh@335 -- $ read -ra ver1
23:48:02 -- scripts/common.sh@336 -- $ IFS=.-:
23:48:02 -- scripts/common.sh@336 -- $ read -ra ver2
23:48:02 -- scripts/common.sh@337 -- $ local 'op=<'
23:48:02 -- scripts/common.sh@339 -- $ ver1_l=3
23:48:02 -- scripts/common.sh@340 -- $ ver2_l=3
23:48:02 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
23:48:02 -- scripts/common.sh@343 -- $ case "$op" in
23:48:02 -- scripts/common.sh@344 -- $ : 1
23:48:02 -- scripts/common.sh@363 -- $ (( v = 0 ))
23:48:02 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
23:48:02 -- scripts/common.sh@364 -- $ decimal 23
23:48:02 -- scripts/common.sh@352 -- $ local d=23
23:48:02 -- scripts/common.sh@353 -- $ [[ 23 =~ ^[0-9]+$ ]]
23:48:02 -- scripts/common.sh@354 -- $ echo 23
23:48:02 -- scripts/common.sh@364 -- $ ver1[v]=23
23:48:02 -- scripts/common.sh@365 -- $ decimal 21
23:48:02 -- scripts/common.sh@352 -- $ local d=21
23:48:02 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]]
23:48:02 -- scripts/common.sh@354 -- $ echo 21
23:48:02 -- scripts/common.sh@365 -- $ ver2[v]=21
00:01:13.326 23:48:02 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
00:01:13.326 23:48:02 -- scripts/common.sh@366 -- $ return 1
00:01:13.326 23:48:02 -- common/autobuild_common.sh@173 -- $ patch -p1
00:01:13.326 patching file config/rte_config.h
00:01:13.326 Hunk #1 succeeded at 60 (offset 1 line).
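The trace above is scripts/common.sh comparing dotted versions: lt splits each version on the characters . - : and walks the fields numerically; here 23 > 21 settles it at the first field, so cmp_versions returns 1 and, since DPDK 23.11.0 is not older than 21.11.0, autobuild proceeds to patch rte_config.h and configure the build. A condensed, self-contained sketch of the same idea (not the SPDK function verbatim):

    # Succeed when dotted version $1 sorts strictly before $2.
    version_lt() {
        local -a v1 v2
        IFS=.-: read -ra v1 <<< "$1"
        IFS=.-: read -ra v2 <<< "$2"
        local i
        for (( i = 0; i < ${#v1[@]} || i < ${#v2[@]}; i++ )); do
            (( ${v1[i]:-0} < ${v2[i]:-0} )) && return 0
            (( ${v1[i]:-0} > ${v2[i]:-0} )) && return 1
        done
        return 1  # equal versions are not "less than"
    }
    version_lt 23.11.0 21.11.0 || echo "23.11.0 is not older than 21.11.0"
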
23:48:02 -- common/autobuild_common.sh@177 -- $ dpdk_kmods=false
23:48:02 -- common/autobuild_common.sh@178 -- $ uname -s
23:48:02 -- common/autobuild_common.sh@178 -- $ '[' Linux = FreeBSD ']'
23:48:02 -- common/autobuild_common.sh@182 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base
23:48:02 -- common/autobuild_common.sh@182 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:18.594 The Meson build system
00:01:18.594 Version: 1.3.1
00:01:18.594 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:18.594 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
00:01:18.594 Build type: native build
00:01:18.594 Program cat found: YES (/usr/bin/cat)
00:01:18.594 Project name: DPDK
00:01:18.594 Project version: 23.11.0
00:01:18.594 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:01:18.594 C linker for the host machine: gcc ld.bfd 2.39-16
00:01:18.594 Host machine cpu family: x86_64
00:01:18.594 Host machine cpu: x86_64
00:01:18.594 Message: ## Building in Developer Mode ##
00:01:18.594 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:18.594 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh)
00:01:18.594 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh)
00:01:18.594 Program python3 found: YES (/usr/bin/python3)
00:01:18.594 Program cat found: YES (/usr/bin/cat)
00:01:18.594 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead.
00:01:18.594 Compiler for C supports arguments -march=native: YES
00:01:18.594 Checking for size of "void *" : 8
00:01:18.594 Checking for size of "void *" : 8 (cached)
00:01:18.594 Library m found: YES
00:01:18.594 Library numa found: YES
00:01:18.594 Has header "numaif.h" : YES
00:01:18.594 Library fdt found: NO
00:01:18.594 Library execinfo found: NO
00:01:18.594 Has header "execinfo.h" : YES
00:01:18.594 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:18.594 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:18.594 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:18.594 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:18.594 Run-time dependency openssl found: YES 3.0.9
00:01:18.594 Run-time dependency libpcap found: YES 1.10.4
00:01:18.594 Has header "pcap.h" with dependency libpcap: YES
00:01:18.594 Compiler for C supports arguments -Wcast-qual: YES
00:01:18.594 Compiler for C supports arguments -Wdeprecated: YES
00:01:18.594 Compiler for C supports arguments -Wformat: YES
00:01:18.594 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:18.594 Compiler for C supports arguments -Wformat-security: NO
00:01:18.594 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:18.594 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:18.594 Compiler for C supports arguments -Wnested-externs: YES
00:01:18.594 Compiler for C supports arguments -Wold-style-definition: YES
00:01:18.594 Compiler for C supports arguments -Wpointer-arith: YES
00:01:18.594 Compiler for C supports arguments -Wsign-compare: YES
00:01:18.594 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:18.594 Compiler for C supports arguments -Wundef: YES
00:01:18.594 Compiler for C supports arguments -Wwrite-strings: YES
00:01:18.594 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:18.594 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:18.594 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:18.594 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:18.594 Program objdump found: YES (/usr/bin/objdump)
00:01:18.594 Compiler for C supports arguments -mavx512f: YES
00:01:18.594 Checking if "AVX512 checking" compiles: YES
00:01:18.594 Fetching value of define "__SSE4_2__" : 1
00:01:18.594 Fetching value of define "__AES__" : 1
00:01:18.594 Fetching value of define "__AVX__" : 1
00:01:18.594 Fetching value of define "__AVX2__" : 1
00:01:18.594 Fetching value of define "__AVX512BW__" : 1
00:01:18.594 Fetching value of define "__AVX512CD__" : 1
00:01:18.594 Fetching value of define "__AVX512DQ__" : 1
00:01:18.594 Fetching value of define "__AVX512F__" : 1
00:01:18.594 Fetching value of define "__AVX512VL__" : 1
00:01:18.594 Fetching value of define "__PCLMUL__" : 1
00:01:18.594 Fetching value of define "__RDRND__" : 1
00:01:18.594 Fetching value of define "__RDSEED__" : 1
00:01:18.594 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:18.594 Fetching value of define "__znver1__" : (undefined)
00:01:18.594 Fetching value of define "__znver2__" : (undefined)
00:01:18.594 Fetching value of define "__znver3__" : (undefined)
00:01:18.594 Fetching value of define "__znver4__" : (undefined)
00:01:18.594 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:18.594 Message: lib/log: Defining dependency "log"
00:01:18.594 Message: lib/kvargs: Defining dependency "kvargs"
00:01:18.594 Message: lib/telemetry: Defining dependency "telemetry"
00:01:18.594 Checking for function "getentropy" : NO
00:01:18.594 Message: lib/eal: Defining dependency "eal"
00:01:18.594 Message: lib/ring: Defining dependency "ring"
00:01:18.594 Message: lib/rcu: Defining dependency "rcu"
00:01:18.594 Message: lib/mempool: Defining dependency "mempool"
00:01:18.594 Message: lib/mbuf: Defining dependency "mbuf"
00:01:18.594 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:18.594 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:18.594 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:18.594 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:18.594 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:18.594 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:18.594 Compiler for C supports arguments -mpclmul: YES
00:01:18.594 Compiler for C supports arguments -maes: YES
00:01:18.594 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:18.594 Compiler for C supports arguments -mavx512bw: YES
00:01:18.594 Compiler for C supports arguments -mavx512dq: YES
00:01:18.594 Compiler for C supports arguments -mavx512vl: YES
00:01:18.594 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:18.594 Compiler for C supports arguments -mavx2: YES
00:01:18.594 Compiler for C supports arguments -mavx: YES
00:01:18.594 Message: lib/net: Defining dependency "net"
00:01:18.594 Message: lib/meter: Defining dependency "meter"
00:01:18.594 Message: lib/ethdev: Defining dependency "ethdev"
00:01:18.594 Message: lib/pci: Defining dependency "pci"
00:01:18.594 Message: lib/cmdline: Defining dependency "cmdline"
00:01:18.594 Message: lib/metrics: Defining dependency "metrics"
00:01:18.594 Message: lib/hash: Defining dependency "hash"
00:01:18.594 Message: lib/timer: Defining dependency "timer"
00:01:18.594 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:18.594 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:18.594 Fetching value of define "__AVX512CD__" : 1 (cached)
00:01:18.594 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:18.594 Message: lib/acl: Defining dependency "acl"
00:01:18.594 Message: lib/bbdev: Defining dependency "bbdev"
00:01:18.594 Message: lib/bitratestats: Defining dependency "bitratestats"
00:01:18.594 Run-time dependency libelf found: YES 0.190
00:01:18.594 Message: lib/bpf: Defining dependency "bpf"
00:01:18.594 Message: lib/cfgfile: Defining dependency "cfgfile"
00:01:18.594 Message: lib/compressdev: Defining dependency "compressdev"
00:01:18.594 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:18.594 Message: lib/distributor: Defining dependency "distributor"
00:01:18.594 Message: lib/dmadev: Defining dependency "dmadev"
00:01:18.594 Message: lib/efd: Defining dependency "efd"
00:01:18.594 Message: lib/eventdev: Defining dependency "eventdev"
00:01:18.594 Message: lib/dispatcher: Defining dependency "dispatcher"
00:01:18.594 Message: lib/gpudev: Defining dependency "gpudev"
00:01:18.594 Message: lib/gro: Defining dependency "gro"
00:01:18.594 Message: lib/gso: Defining dependency "gso"
00:01:18.594 Message: lib/ip_frag: Defining dependency "ip_frag"
00:01:18.594 Message: lib/jobstats: Defining dependency "jobstats"
00:01:18.594 Message: lib/latencystats: Defining dependency "latencystats"
00:01:18.594 Message: lib/lpm: Defining dependency "lpm"
00:01:18.594 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:18.594 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:18.594 Fetching value of define "__AVX512IFMA__" : (undefined)
00:01:18.594 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES
00:01:18.594 Message: lib/member: Defining dependency "member"
00:01:18.594 Message: lib/pcapng: Defining dependency "pcapng"
00:01:18.595 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:18.595 Message: lib/power: Defining dependency "power"
00:01:18.595 Message: lib/rawdev: Defining dependency "rawdev"
00:01:18.595 Message: lib/regexdev: Defining dependency "regexdev"
00:01:18.595 Message: lib/mldev: Defining dependency "mldev"
00:01:18.595 Message: lib/rib: Defining dependency "rib"
00:01:18.595 Message: lib/reorder: Defining dependency "reorder"
00:01:18.595 Message: lib/sched: Defining dependency "sched"
00:01:18.595 Message: lib/security: Defining dependency "security"
00:01:18.595 Message: lib/stack: Defining dependency "stack"
00:01:18.595 Has header "linux/userfaultfd.h" : YES
00:01:18.595 Has header "linux/vduse.h" : YES
00:01:18.595 Message: lib/vhost: Defining dependency "vhost"
00:01:18.595 Message: lib/ipsec: Defining dependency "ipsec"
00:01:18.595 Message: lib/pdcp: Defining dependency "pdcp"
00:01:18.595 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:18.595 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:18.595 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:18.595 Message: lib/fib: Defining dependency "fib"
00:01:18.595 Message: lib/port: Defining dependency "port"
00:01:18.595 Message: lib/pdump: Defining dependency "pdump"
00:01:18.595 Message: lib/table: Defining dependency "table"
00:01:18.595 Message: lib/pipeline: Defining dependency "pipeline"
00:01:18.595 Message: lib/graph: Defining dependency "graph"
00:01:18.595 Message: lib/node: Defining dependency "node"
00:01:19.165 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:19.165 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:19.165 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:19.165 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:01:19.165 Compiler for C supports arguments -Wno-sign-compare: YES
00:01:19.165 Compiler for C supports arguments -Wno-unused-value: YES
00:01:19.165 Compiler for C supports arguments -Wno-format: YES
00:01:19.165 Compiler for C supports arguments -Wno-format-security: YES
00:01:19.165 Compiler for C supports arguments -Wno-format-nonliteral: YES
00:01:19.165 Compiler for C supports arguments -Wno-strict-aliasing: YES
00:01:19.165 Compiler for C supports arguments -Wno-unused-but-set-variable: YES
00:01:19.165 Compiler for C supports arguments -Wno-unused-parameter: YES
00:01:19.165 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:19.165 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:19.165 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:19.165 Compiler for C supports arguments -mavx512bw: YES (cached)
00:01:19.165 Compiler for C supports arguments -march=skylake-avx512: YES
00:01:19.165 Message: drivers/net/i40e: Defining dependency "net_i40e"
00:01:19.165 Has header "sys/epoll.h" : YES
00:01:19.165 Program doxygen found: YES (/usr/bin/doxygen)
00:01:19.165 Configuring doxy-api-html.conf using configuration
00:01:19.165 Configuring doxy-api-man.conf using configuration
00:01:19.165 Program mandb found: YES (/usr/bin/mandb)
00:01:19.165 Program sphinx-build found: NO
00:01:19.165 Configuring rte_build_config.h using configuration
00:01:19.165 Message:
00:01:19.165 =================
00:01:19.165 Applications Enabled
00:01:19.165 =================
00:01:19.165
00:01:19.165 apps:
00:01:19.165 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf,
00:01:19.165 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline,
00:01:19.165 test-pmd, test-regex, test-sad, test-security-perf,
00:01:19.165
00:01:19.165 Message:
00:01:19.165 =================
00:01:19.165 Libraries Enabled
00:01:19.165 =================
00:01:19.165
00:01:19.165 libs:
00:01:19.165 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:01:19.165 net, meter, ethdev, pci, cmdline, metrics, hash, timer,
00:01:19.165 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor,
00:01:19.165 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag,
00:01:19.165 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev,
00:01:19.165 mldev, rib, reorder, sched, security, stack, vhost, ipsec,
00:01:19.165 pdcp, fib, port, pdump, table, pipeline, graph, node,
00:01:19.165
00:01:19.165
00:01:19.165 Message:
00:01:19.165 ===============
00:01:19.165 Drivers Enabled
00:01:19.165 ===============
00:01:19.165
00:01:19.165 common:
00:01:19.165
00:01:19.165 bus:
00:01:19.165 pci, vdev,
00:01:19.165 mempool:
00:01:19.165 ring,
00:01:19.165 dma:
00:01:19.165
00:01:19.165 net:
00:01:19.165 i40e,
00:01:19.165 raw:
00:01:19.165
00:01:19.165 crypto:
00:01:19.165
00:01:19.165 compress:
00:01:19.165
00:01:19.165 regex:
00:01:19.165
00:01:19.165 ml:
00:01:19.165
00:01:19.165 vdpa:
00:01:19.165
00:01:19.165 event:
00:01:19.165
00:01:19.165 baseband:
00:01:19.165
00:01:19.165 gpu:
00:01:19.165
00:01:19.165
00:01:19.165 Message:
00:01:19.165 =================
00:01:19.165 Content Skipped
00:01:19.165 =================
00:01:19.165
00:01:19.165 apps:
00:01:19.165
00:01:19.165 libs:
00:01:19.165
00:01:19.165 drivers:
00:01:19.165 common/cpt: not in enabled drivers build config
00:01:19.165 common/dpaax: not in enabled drivers build config
00:01:19.165 common/iavf: not in enabled drivers build config
00:01:19.165 common/idpf: not in enabled drivers build config
00:01:19.165 common/mvep: not in enabled drivers build config
00:01:19.165 common/octeontx: not in enabled drivers build config
00:01:19.165 bus/auxiliary: not in enabled drivers build config
00:01:19.165 bus/cdx: not in enabled drivers build config
00:01:19.165 bus/dpaa: not in enabled drivers build config
00:01:19.165 bus/fslmc: not in enabled drivers build config
00:01:19.165 bus/ifpga: not in enabled drivers build config
00:01:19.165 bus/platform: not in enabled drivers build config
00:01:19.165 bus/vmbus: not in enabled drivers build config
00:01:19.165 common/cnxk: not in enabled drivers build config
00:01:19.165 common/mlx5: not in enabled drivers build config
00:01:19.166 common/nfp: not in enabled drivers build config
00:01:19.166 common/qat: not in enabled drivers build config
00:01:19.166 common/sfc_efx: not in enabled drivers build config
00:01:19.166 mempool/bucket: not in enabled drivers build config
00:01:19.166 mempool/cnxk: not in enabled drivers build config
00:01:19.166 mempool/dpaa: not in enabled drivers build config
00:01:19.166 mempool/dpaa2: not in enabled drivers build config
00:01:19.166 mempool/octeontx: not in enabled drivers build config
00:01:19.166 mempool/stack: not in enabled drivers build config
00:01:19.166 dma/cnxk: not in enabled drivers build config
00:01:19.166 dma/dpaa: not in enabled drivers build config
00:01:19.166 dma/dpaa2: not in enabled drivers build config
00:01:19.166 dma/hisilicon: not in enabled drivers build config
00:01:19.166 dma/idxd: not in enabled drivers build config
00:01:19.166 dma/ioat: not in enabled drivers build config
00:01:19.166 dma/skeleton: not in enabled drivers build config
00:01:19.166 net/af_packet: not in enabled drivers build config
00:01:19.166 net/af_xdp: not in enabled drivers build config
00:01:19.166 net/ark: not in enabled drivers build config
00:01:19.166 net/atlantic: not in enabled drivers build config
00:01:19.166 net/avp: not in enabled drivers build config
00:01:19.166 net/axgbe: not in enabled drivers build config
00:01:19.166 net/bnx2x: not in enabled drivers build config
00:01:19.166 net/bnxt: not in enabled drivers build config
00:01:19.166 net/bonding: not in enabled drivers build config
00:01:19.166 net/cnxk: not in enabled drivers build config
00:01:19.166 net/cpfl: not in enabled drivers build config
00:01:19.166 net/cxgbe: not in enabled drivers build config
00:01:19.166 net/dpaa: not in enabled drivers build config
00:01:19.166 net/dpaa2: not in enabled drivers build config
00:01:19.166 net/e1000: not in enabled drivers build config
00:01:19.166 net/ena: not in enabled drivers build config
00:01:19.166 net/enetc: not in enabled drivers build config
00:01:19.166 net/enetfec: not in enabled drivers build config
00:01:19.166 net/enic: not in enabled drivers build config
00:01:19.166 net/failsafe: not in enabled drivers build config
00:01:19.166 net/fm10k: not in enabled drivers build config
00:01:19.166 net/gve: not in enabled drivers build config
00:01:19.166 net/hinic: not in enabled drivers build config
00:01:19.166 net/hns3: not in enabled drivers build config
00:01:19.166 net/iavf: not in enabled drivers build config
00:01:19.166 net/ice: not in enabled drivers build config
00:01:19.166 net/idpf: not in enabled drivers build config
00:01:19.166 net/igc: not in enabled drivers build config
00:01:19.166 net/ionic: not in enabled drivers build config
00:01:19.166 net/ipn3ke: not in enabled drivers build config
00:01:19.166 net/ixgbe: not in enabled drivers build config
00:01:19.166 net/mana: not in enabled drivers build config
00:01:19.166 net/memif: not in enabled drivers build config
00:01:19.166 net/mlx4: not in enabled drivers build config
00:01:19.166 net/mlx5: not in enabled drivers build config
00:01:19.166 net/mvneta: not in enabled drivers build config
00:01:19.166 net/mvpp2: not in enabled drivers build config
00:01:19.166 net/netvsc: not in enabled drivers build config
00:01:19.166 net/nfb: not in enabled drivers build config
00:01:19.166 net/nfp: not in enabled drivers build config
00:01:19.166 net/ngbe: not in enabled drivers build config
00:01:19.166 net/null: not in enabled drivers build config
00:01:19.166 net/octeontx: not in enabled drivers build config
00:01:19.166 net/octeon_ep: not in enabled drivers build config
00:01:19.166 net/pcap: not in enabled drivers build config
00:01:19.166 net/pfe: not in enabled drivers build config
00:01:19.166 net/qede: not in enabled drivers build config
00:01:19.166 net/ring: not in enabled drivers build config
00:01:19.166 net/sfc: not in enabled drivers build config
00:01:19.166 net/softnic: not in enabled drivers build config
00:01:19.166 net/tap: not in enabled drivers build config
00:01:19.166 net/thunderx: not in enabled drivers build config
00:01:19.166 net/txgbe: not in enabled drivers build config
00:01:19.166 net/vdev_netvsc: not in enabled drivers build config
00:01:19.166 net/vhost: not in enabled drivers build config
00:01:19.166 net/virtio: not in enabled drivers build config
00:01:19.166 net/vmxnet3: not in enabled drivers build config
00:01:19.166 raw/cnxk_bphy: not in enabled drivers build config
00:01:19.166 raw/cnxk_gpio: not in enabled drivers build config
00:01:19.166 raw/dpaa2_cmdif: not in enabled drivers build config
00:01:19.166 raw/ifpga: not in enabled drivers build config
00:01:19.166 raw/ntb: not in enabled drivers build config
00:01:19.166 raw/skeleton: not in enabled drivers build config
00:01:19.166 crypto/armv8: not in enabled drivers build config
00:01:19.166 crypto/bcmfs: not in enabled drivers build config
00:01:19.166 crypto/caam_jr: not in enabled drivers build config
00:01:19.166 crypto/ccp: not in enabled drivers build config
00:01:19.166 crypto/cnxk: not in enabled drivers build config
00:01:19.166 crypto/dpaa_sec: not in enabled drivers build config
00:01:19.166 crypto/dpaa2_sec: not in enabled drivers build config
00:01:19.166 crypto/ipsec_mb: not in enabled drivers build config
00:01:19.166 crypto/mlx5: not in enabled drivers build config
00:01:19.166 crypto/mvsam: not in enabled drivers build config
00:01:19.166 crypto/nitrox: not in enabled drivers build config
00:01:19.166 crypto/null: not in enabled drivers build config
00:01:19.166 crypto/octeontx: not in enabled drivers build config
00:01:19.166 crypto/openssl: not in enabled drivers build config
00:01:19.166 crypto/scheduler: not in enabled drivers build config
00:01:19.166 crypto/uadk: not in enabled drivers build config
00:01:19.166 crypto/virtio: not in enabled drivers build config
00:01:19.166 compress/isal: not in enabled drivers build config
00:01:19.166 compress/mlx5: not in enabled drivers build config
00:01:19.166 compress/octeontx: not in enabled drivers build config
00:01:19.166 compress/zlib: not in enabled drivers build config
00:01:19.166 regex/mlx5: not in enabled drivers build config
00:01:19.166 regex/cn9k: not in enabled drivers build config
00:01:19.166 ml/cnxk: not in enabled drivers build config
00:01:19.166 vdpa/ifc: not in enabled drivers build config
00:01:19.166 vdpa/mlx5: not in enabled drivers build config
00:01:19.166 vdpa/nfp: not in enabled drivers build config
00:01:19.166 vdpa/sfc: not in enabled drivers build config
00:01:19.166 event/cnxk: not in enabled drivers build config
00:01:19.166 event/dlb2: not in enabled drivers build config
00:01:19.166 event/dpaa: not in enabled drivers build config
00:01:19.166 event/dpaa2: not in enabled drivers build config
00:01:19.166 event/dsw: not in enabled drivers build config
00:01:19.166 event/opdl: not in enabled drivers build config
00:01:19.166 event/skeleton: not in enabled drivers build config
00:01:19.166 event/sw: not in enabled drivers build config
00:01:19.166 event/octeontx: not in enabled drivers build config
00:01:19.166 baseband/acc: not in enabled drivers build config
00:01:19.166 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:01:19.166 baseband/fpga_lte_fec: not in enabled drivers build config
00:01:19.166 baseband/la12xx: not in enabled drivers build config
00:01:19.166 baseband/null: not in enabled drivers build config
00:01:19.166 baseband/turbo_sw: not in enabled drivers build config
00:01:19.166 gpu/cuda: not in enabled drivers build config
00:01:19.166
00:01:19.166
00:01:19.166 Build targets in project: 217
00:01:19.166
00:01:19.166 DPDK 23.11.0
00:01:19.166
00:01:19.166 User defined options
00:01:19.166 libdir : lib
00:01:19.166 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:19.166 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:01:19.166 c_link_args :
00:01:19.166 enable_docs : false
00:01:19.166 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:19.166 enable_kmods : false
00:01:19.166 machine : native
00:01:19.166 tests : false
00:01:19.166
00:01:19.166 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:19.166 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
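The warning that closes the configure step is about invocation form only: autobuild calls `meson build-tmp ...` directly, and newer Meson wants the subcommand spelled out. The recommended spelling of the very same configure step, with options copied from the command above, would be:

    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
        --libdir lib \
        -Denable_docs=false -Denable_kmods=false -Dtests=false \
        -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Dmachine=native \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
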
23:48:08 -- common/autobuild_common.sh@186 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112
00:01:19.431 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:01:19.431 [1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:01:19.431 [2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:01:19.431 [3/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:01:19.431 [4/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:01:19.431 [5/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:01:19.431 [6/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:01:19.431 [7/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:01:19.431 [8/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:01:19.431 [9/707] Linking static target lib/librte_kvargs.a
00:01:19.431 [10/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:01:19.431 [11/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:01:19.431 [12/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:01:19.431 [13/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:01:19.431 [14/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:01:19.431 [15/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:01:19.431 [16/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:01:19.431 [17/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:01:19.431 [18/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:01:19.691 [19/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:01:19.691 [20/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:01:19.691 [21/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:01:19.691 [22/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:01:19.691 [23/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:01:19.691 [24/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:01:19.691 [25/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:01:19.691 [26/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:01:19.691 [27/707] Linking static target lib/librte_pci.a
00:01:19.691 [28/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:01:19.691 [29/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:01:19.691 [30/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:01:19.691 [31/707] Compiling C object lib/librte_log.a.p/log_log.c.o
00:01:19.691 [32/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:01:19.691 [33/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:01:19.691 [34/707] Linking static target lib/librte_log.a
00:01:19.953 [35/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:01:19.953 [36/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:01:19.953 [37/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:01:19.953 [38/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:01:19.953 [39/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:01:19.953 [40/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:01:19.953 [41/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:01:19.953 [42/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:01:19.953 [43/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:01:19.953 [44/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:01:19.953 [45/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:01:19.953 [46/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:01:19.953 [47/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:01:19.953 [48/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:01:19.953 [49/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:01:19.953 [50/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:01:19.953 [51/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:01:19.953 [52/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:01:19.953 [53/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:01:19.953 [54/707] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:01:19.953 [55/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:01:19.953 [56/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:01:20.214 [57/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:01:20.214 [58/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:01:20.214 [59/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:01:20.214 [60/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:01:20.214 [61/707] Linking static target lib/librte_meter.a
00:01:20.214 [62/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:01:20.214 [63/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:01:20.214 [64/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:01:20.214 [65/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:01:20.214 [66/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:01:20.214 [67/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:01:20.214 [68/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:01:20.214 [69/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:01:20.214 [70/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:01:20.214 [71/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:01:20.214 [72/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o
00:01:20.214 [73/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:01:20.214 [74/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:01:20.214 [75/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:01:20.214 [76/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:01:20.214 [77/707] Linking static target lib/librte_ring.a
00:01:20.214 [78/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:01:20.214 [79/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:01:20.214 [80/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:01:20.214 [81/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:01:20.214 [82/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:01:20.214 [83/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:01:20.214 [84/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:01:20.214 [85/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:01:20.214 [86/707] Linking static target lib/librte_cmdline.a
00:01:20.214 [87/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o
00:01:20.214 [88/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:01:20.214 [89/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:01:20.214 [90/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o
00:01:20.214 [91/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:01:20.214 [92/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:01:20.214 [93/707] Linking static target lib/librte_metrics.a
00:01:20.214 [94/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o
00:01:20.214 [95/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:01:20.214 [96/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:01:20.214 [97/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:01:20.214 [98/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:01:20.215 [99/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:01:20.215 [100/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o
00:01:20.215 [101/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:01:20.215 [102/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o
00:01:20.215 [103/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:01:20.215 [104/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o
00:01:20.215 [105/707] Linking static target lib/net/libnet_crc_avx512_lib.a
00:01:20.215 [106/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:01:20.215 [107/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:01:20.215 [108/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o
00:01:20.215 [109/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o
00:01:20.215 [110/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o
00:01:20.215 [111/707] Linking static target lib/librte_cfgfile.a
00:01:20.215 [112/707] Linking static target lib/librte_net.a
00:01:20.215 [113/707] Linking static target lib/librte_bitratestats.a
00:01:20.215 [114/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:01:20.215 [115/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:01:20.477 [116/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:01:20.477 [117/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
00:01:20.477 [118/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:01:20.477 [119/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:01:20.477 [120/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:01:20.477 [121/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:01:20.477 [122/707] Linking target lib/librte_log.so.24.0
00:01:20.477 [123/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:01:20.477 [124/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:01:20.477 [125/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:01:20.477 [126/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
00:01:20.477 [127/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:01:20.477 [128/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:01:20.477 [129/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o
00:01:20.477 [130/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:01:20.477 [131/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:01:20.477 [132/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:01:20.477 [133/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:01:20.477 [134/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:01:20.477 [135/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o
00:01:20.477 [136/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output)
00:01:20.477 [137/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:01:20.477 [138/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
00:01:20.477 [139/707] Linking static target lib/librte_timer.a
00:01:20.739 [140/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o
00:01:20.739 [141/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o
00:01:20.739 [142/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output)
00:01:20.739 [143/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols
00:01:20.739 [144/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:01:20.739 [145/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o
00:01:20.739 [146/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:01:20.739 [147/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:01:20.739 [148/707] Linking target lib/librte_kvargs.so.24.0
00:01:20.739 [149/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:01:20.739 [150/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:01:20.739 [151/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o
00:01:20.739 [152/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o
00:01:20.739 [153/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o
00:01:20.739 [154/707] Linking
static target lib/librte_bbdev.a 00:01:20.739 [155/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:20.739 [156/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:20.739 [157/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:20.739 [158/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:20.739 [159/707] Linking static target lib/librte_mempool.a 00:01:20.739 [160/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:20.739 [161/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:20.739 [162/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:20.739 [163/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:20.739 [164/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:20.739 [165/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:20.739 [166/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:20.739 [167/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:20.739 [168/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:20.739 [169/707] Linking static target lib/librte_jobstats.a 00:01:20.739 [170/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:20.739 [171/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:20.739 [172/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:20.739 [173/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:20.739 [174/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.002 [175/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:21.002 [176/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:21.002 [177/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:21.002 [178/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:21.002 [179/707] Linking static target lib/librte_compressdev.a 00:01:21.002 [180/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:21.002 [181/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:21.002 [182/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:21.002 [183/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:21.002 [184/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:21.002 [185/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:21.002 [186/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:01:21.002 [187/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:01:21.002 [188/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:01:21.002 [189/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:21.002 [190/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:01:21.002 [191/707] Linking static target lib/librte_dispatcher.a 00:01:21.002 [192/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:21.002 [193/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:21.002 [194/707] Compiling C object 
lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:21.002 [195/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:21.002 [196/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:21.002 [197/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:21.002 [198/707] Linking static target lib/librte_telemetry.a 00:01:21.002 [199/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:21.002 [200/707] Linking static target lib/librte_latencystats.a 00:01:21.002 [201/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:01:21.002 [202/707] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:21.002 [203/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:21.002 [204/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.002 [205/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:21.002 [206/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:21.002 [207/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:21.002 [208/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:21.264 [209/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:21.264 [210/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:21.264 [211/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:21.264 [212/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:21.264 [213/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:21.264 [214/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:21.264 [215/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:21.264 [216/707] Linking static target lib/librte_rcu.a 00:01:21.264 [217/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:21.264 [218/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:21.264 [219/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:21.264 [220/707] Linking static target lib/librte_gro.a 00:01:21.264 [221/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:21.264 [222/707] Linking static target lib/librte_gpudev.a 00:01:21.264 [223/707] Linking static target lib/librte_dmadev.a 00:01:21.264 [224/707] Linking static target lib/librte_eal.a 00:01:21.264 [225/707] Linking static target lib/librte_stack.a 00:01:21.264 [226/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:21.264 [227/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:21.264 [228/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:21.264 [229/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:21.264 [230/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:21.264 [231/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:21.264 [232/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:21.264 [233/707] Linking static target lib/librte_regexdev.a 00:01:21.264 [234/707] Linking static target lib/librte_distributor.a 00:01:21.264 [235/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:21.264 [236/707] Linking static target lib/librte_gso.a 00:01:21.264 [237/707] 
Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:01:21.265 [238/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:01:21.265 [239/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:21.265 [240/707] Linking static target lib/librte_mldev.a 00:01:21.265 [241/707] Linking static target lib/librte_rawdev.a 00:01:21.265 [242/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.265 [243/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:21.265 [244/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:21.265 [245/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:21.265 [246/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:21.265 [247/707] Linking static target lib/librte_mbuf.a 00:01:21.265 [248/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:21.265 [249/707] Linking static target lib/librte_ip_frag.a 00:01:21.265 [250/707] Linking static target lib/librte_power.a 00:01:21.527 [251/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:21.528 [252/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:21.528 [253/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:01:21.528 [254/707] Linking static target lib/librte_pcapng.a 00:01:21.528 [255/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.528 [256/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:01:21.528 [257/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:01:21.528 [258/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:21.528 [259/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:21.528 [260/707] Linking static target lib/librte_reorder.a 00:01:21.528 [261/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:01:21.528 [262/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:21.528 [263/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.528 [264/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:21.528 [265/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:21.528 [266/707] Linking static target lib/librte_bpf.a 00:01:21.528 [267/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:21.528 [268/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:21.528 [269/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.528 [270/707] Linking static target lib/librte_security.a 00:01:21.528 [271/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.528 [272/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:21.528 [273/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:21.528 [274/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.528 [275/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.528 [276/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:01:21.790 [277/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:21.790 [278/707] Compiling C object 
lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:21.790 [279/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:21.790 [280/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:21.790 [281/707] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:21.790 [282/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.790 [283/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:21.790 [284/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.791 [285/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.791 [286/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:21.791 [287/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:21.791 [288/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:21.791 [289/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.791 [290/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:21.791 [291/707] Linking static target lib/librte_rib.a 00:01:21.791 [292/707] Linking static target lib/librte_lpm.a 00:01:21.791 [293/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:21.791 [294/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.791 [295/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:21.791 [296/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.791 [297/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.791 [298/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:01:21.791 [299/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.791 [300/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:21.791 [301/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:21.791 [302/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:21.791 [303/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:21.791 [304/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:21.791 [305/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:21.791 [306/707] Linking target lib/librte_telemetry.so.24.0 00:01:21.791 [307/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:22.056 [308/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:22.056 [309/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:22.056 [310/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:22.056 [311/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:22.056 [312/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:22.056 [313/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:22.056 [314/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:22.056 [315/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:22.056 [316/707] Generating lib/reorder.sym_chk with a custom command (wrapped by 
meson to capture output) 00:01:22.056 [317/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:22.056 [318/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:22.056 [319/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:22.056 [320/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:22.056 [321/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:22.056 [322/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:22.056 [323/707] Linking static target lib/librte_efd.a 00:01:22.056 [324/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:22.056 [325/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:22.056 [326/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:22.056 [327/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:22.056 [328/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:22.056 [329/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:22.056 [330/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:22.056 [331/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:22.056 [332/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:22.056 [333/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:22.326 [334/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:01:22.326 [335/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:22.326 [336/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:22.326 [337/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:22.326 [338/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:22.326 [339/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:22.326 [340/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:22.326 [341/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:22.326 [342/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:22.326 [343/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:01:22.326 [344/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:22.326 [345/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:22.326 [346/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:22.326 [347/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:22.326 [348/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:22.326 [349/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:22.326 [350/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:22.326 [351/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:22.326 [352/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:22.326 [353/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:22.326 [354/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:22.326 [355/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:22.326 
[356/707] Linking static target lib/librte_fib.a 00:01:22.326 [357/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:22.591 [358/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:22.591 [359/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:22.591 [360/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:01:22.591 [361/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:22.591 [362/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:22.591 [363/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:22.591 [364/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:22.591 [365/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:22.591 [366/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:22.591 [367/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:22.591 [368/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:22.591 [369/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:22.591 [370/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:22.591 [371/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:22.591 [372/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:01:22.591 [373/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:22.591 [374/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:22.591 [375/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:22.591 [376/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:01:22.591 [377/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:22.591 [378/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:22.591 [379/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:22.591 [380/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:22.855 [381/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:01:22.855 [382/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:22.855 [383/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:22.855 [384/707] Linking static target lib/librte_pdump.a 00:01:22.855 [385/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:22.855 [386/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:22.855 [387/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:01:22.855 [388/707] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:01:22.855 [389/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:22.855 [390/707] Linking static target lib/librte_graph.a 00:01:22.855 [391/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:01:22.855 [392/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:01:22.855 [393/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:01:22.855 [394/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:22.855 [395/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:01:22.855 [396/707] Compiling C object 
app/dpdk-graph.p/graph_utils.c.o 00:01:22.855 [397/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:22.855 [398/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:01:22.855 [399/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:01:22.855 [400/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:01:22.855 [401/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:22.855 [402/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:22.855 [403/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:01:22.855 [404/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:23.118 [405/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:23.118 [406/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:23.118 [407/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.118 [408/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:23.118 [409/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:23.118 [410/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:23.118 [411/707] Linking static target drivers/librte_bus_vdev.a 00:01:23.118 [412/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:23.118 [413/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:23.118 [414/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:01:23.118 [415/707] Linking static target lib/librte_table.a 00:01:23.118 [416/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:01:23.118 [417/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:01:23.118 [418/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:01:23.118 [419/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:23.118 [420/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:01:23.118 [421/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:23.118 [422/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:23.118 [423/707] Linking static target lib/librte_sched.a 00:01:23.118 [424/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:23.118 [425/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:23.118 [426/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:01:23.118 [427/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:01:23.118 [428/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:23.118 [429/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:23.118 [430/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.377 [431/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:23.377 [432/707] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:23.377 [433/707] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:23.377 [434/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:23.377 [435/707] Compiling C object 
app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:23.377 [436/707] Linking static target lib/librte_cryptodev.a 00:01:23.377 [437/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:23.377 [438/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:23.377 [439/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:23.377 [440/707] Linking static target drivers/librte_bus_pci.a 00:01:23.377 [441/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:23.377 [442/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:01:23.377 [443/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:23.377 [444/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:01:23.377 [445/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:23.377 [446/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:01:23.377 [447/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:23.377 [448/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:23.377 [449/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.377 [450/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.377 [451/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:23.637 [452/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:23.637 [453/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:23.637 [454/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:23.637 [455/707] Linking static target lib/librte_ipsec.a 00:01:23.637 [456/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:01:23.637 [457/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:23.637 [458/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:23.637 [459/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:01:23.637 [460/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:23.637 [461/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:01:23.637 [462/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:23.637 [463/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:01:23.637 [464/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:23.637 [465/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:23.637 [466/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:23.637 [467/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:01:23.637 [468/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:01:23.637 [469/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:23.637 [470/707] Linking static target lib/librte_member.a 00:01:23.637 [471/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:01:23.637 [472/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:23.637 [473/707] 
Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:23.637 [474/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:01:23.637 [475/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:23.637 [476/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:23.637 [477/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:23.637 [478/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:23.637 [479/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.637 [480/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:23.637 [481/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:23.637 [482/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:01:23.894 [483/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:23.894 [484/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:23.894 [485/707] Linking static target lib/librte_port.a 00:01:23.894 [486/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:23.894 [487/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:23.894 [488/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:23.894 [489/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:23.894 [490/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:23.894 [491/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:23.894 [492/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:01:23.894 [493/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:23.894 [494/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:23.894 [495/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:01:23.894 [496/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.894 [497/707] Linking static target drivers/librte_mempool_ring.a 00:01:23.894 [498/707] Linking static target lib/librte_node.a 00:01:23.894 [499/707] Linking static target lib/librte_pdcp.a 00:01:23.894 [500/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:23.894 [501/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:23.894 [502/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:23.894 [503/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:01:23.894 [504/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:01:23.894 [505/707] Linking static target lib/acl/libavx2_tmp.a 00:01:23.894 [506/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:23.894 [507/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:23.894 [508/707] Linking static target lib/librte_hash.a 00:01:23.894 [509/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:23.894 [510/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:23.894 [511/707] Generating lib/ipsec.sym_chk with a 
custom command (wrapped by meson to capture output) 00:01:23.894 [512/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:23.894 [513/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.894 [514/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:23.894 [515/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:23.894 [516/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.894 [517/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:23.894 [518/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:23.894 [519/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:23.894 [520/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:23.894 [521/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.894 [522/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:23.894 [523/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:01:23.894 [524/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:23.894 [525/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:23.894 [526/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:24.153 [527/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:24.153 [528/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:24.153 [529/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:24.153 [530/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:24.153 [531/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:24.153 [532/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:24.153 [533/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:24.153 [534/707] Linking static target lib/librte_eventdev.a 00:01:24.153 [535/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:24.153 [536/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.153 [537/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:24.153 [538/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.153 [539/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:24.153 [540/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:24.153 [541/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:24.153 [542/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:24.153 [543/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:24.153 [544/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:24.410 [545/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:24.410 [546/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:24.410 [547/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:24.410 [548/707] Compiling C object 
app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:24.410 [549/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:24.410 [550/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:24.410 [551/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.410 [552/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:24.410 [553/707] Linking static target lib/librte_acl.a 00:01:24.410 [554/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:24.410 [555/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:24.410 [556/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:24.410 [557/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:24.410 [558/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:24.410 [559/707] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:24.410 [560/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:24.410 [561/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:24.410 [562/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:24.668 [563/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:24.668 [564/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.668 [565/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:01:24.668 [566/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:01:24.668 [567/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:24.668 [568/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:24.668 [569/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.926 [570/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:01:24.926 [571/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:24.926 [572/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.926 [573/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:24.926 [574/707] Linking static target lib/librte_ethdev.a 00:01:25.182 [575/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:25.440 [576/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:25.440 [577/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:25.440 [578/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:25.697 [579/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:25.955 [580/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:26.213 [581/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:26.213 [582/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:26.472 [583/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:26.472 [584/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:26.472 [585/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:26.472 [586/707] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 
00:01:26.729 [587/707] Linking static target drivers/librte_net_i40e.a
00:01:26.986 [588/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o
00:01:27.551 [589/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:27.809 [590/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output)
00:01:27.809 [591/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o
00:01:28.376 [592/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o
00:01:33.641 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.641 [594/707] Linking target lib/librte_eal.so.24.0
00:01:33.641 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols
00:01:33.641 [596/707] Linking target drivers/librte_bus_vdev.so.24.0
00:01:33.641 [597/707] Linking target lib/librte_acl.so.24.0
00:01:33.641 [598/707] Linking target lib/librte_ring.so.24.0
00:01:33.641 [599/707] Linking target lib/librte_meter.so.24.0
00:01:33.641 [600/707] Linking target lib/librte_dmadev.so.24.0
00:01:33.641 [601/707] Linking target lib/librte_pci.so.24.0
00:01:33.641 [602/707] Linking target lib/librte_timer.so.24.0
00:01:33.641 [603/707] Linking target lib/librte_cfgfile.so.24.0
00:01:33.641 [604/707] Linking target lib/librte_jobstats.so.24.0
00:01:33.641 [605/707] Linking target lib/librte_stack.so.24.0
00:01:33.641 [606/707] Linking target lib/librte_rawdev.so.24.0
00:01:33.641 [607/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols
00:01:33.641 [608/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols
00:01:33.641 [609/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols
00:01:33.641 [610/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols
00:01:33.641 [611/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols
00:01:33.641 [612/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols
00:01:33.641 [613/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols
00:01:33.641 [614/707] Linking target drivers/librte_bus_pci.so.24.0
00:01:33.641 [615/707] Linking target lib/librte_rcu.so.24.0
00:01:33.641 [616/707] Linking target lib/librte_mempool.so.24.0
00:01:33.641 [617/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols
00:01:33.641 [618/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols
00:01:33.641 [619/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols
00:01:33.642 [620/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.642 [621/707] Linking target lib/librte_rib.so.24.0
00:01:33.642 [622/707] Linking target drivers/librte_mempool_ring.so.24.0
00:01:33.642 [623/707] Linking target lib/librte_mbuf.so.24.0
00:01:33.642 [624/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols
00:01:33.642 [625/707] Linking target lib/librte_fib.so.24.0
00:01:33.642 [626/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols
00:01:33.900 [627/707] Linking target lib/librte_bbdev.so.24.0
00:01:33.900 [628/707] Linking target lib/librte_gpudev.so.24.0
00:01:33.900 [629/707] Linking target lib/librte_regexdev.so.24.0
00:01:33.900 [630/707] Linking target lib/librte_net.so.24.0
00:01:33.900 [631/707] Linking target lib/librte_mldev.so.24.0
00:01:33.900 [632/707] Linking target lib/librte_compressdev.so.24.0
00:01:33.900 [633/707] Linking target lib/librte_distributor.so.24.0
00:01:33.900 [634/707] Linking target lib/librte_reorder.so.24.0
00:01:33.900 [635/707] Linking target lib/librte_cryptodev.so.24.0
00:01:33.900 [636/707] Linking target lib/librte_sched.so.24.0
00:01:33.900 [637/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols
00:01:33.900 [638/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols
00:01:33.900 [639/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols
00:01:33.900 [640/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols
00:01:33.900 [641/707] Linking target lib/librte_cmdline.so.24.0
00:01:33.900 [642/707] Linking target lib/librte_hash.so.24.0
00:01:33.900 [643/707] Linking target lib/librte_security.so.24.0
00:01:33.900 [644/707] Linking target lib/librte_ethdev.so.24.0
00:01:34.158 [645/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols
00:01:34.159 [646/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols
00:01:34.159 [647/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols
00:01:34.159 [648/707] Linking target lib/librte_lpm.so.24.0
00:01:34.159 [649/707] Linking target lib/librte_efd.so.24.0
00:01:34.159 [650/707] Linking target lib/librte_member.so.24.0
00:01:34.159 [651/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o
00:01:34.159 [652/707] Linking target lib/librte_metrics.so.24.0
00:01:34.159 [653/707] Linking target lib/librte_ipsec.so.24.0
00:01:34.159 [654/707] Linking target lib/librte_pdcp.so.24.0
00:01:34.159 [655/707] Linking target lib/librte_gso.so.24.0
00:01:34.159 [656/707] Linking target lib/librte_bpf.so.24.0
00:01:34.159 [657/707] Linking target lib/librte_pcapng.so.24.0
00:01:34.159 [658/707] Linking target lib/librte_eventdev.so.24.0
00:01:34.159 [659/707] Linking target lib/librte_gro.so.24.0
00:01:34.159 [660/707] Linking target lib/librte_ip_frag.so.24.0
00:01:34.159 [661/707] Linking static target lib/librte_pipeline.a
00:01:34.159 [662/707] Linking target lib/librte_power.so.24.0
00:01:34.159 [663/707] Linking target drivers/librte_net_i40e.so.24.0
00:01:34.417 [664/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols
00:01:34.417 [665/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols
00:01:34.417 [666/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols
00:01:34.417 [667/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols
00:01:34.417 [668/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols
00:01:34.417 [669/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols
00:01:34.417 [670/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols
00:01:34.417 [671/707] Linking target lib/librte_latencystats.so.24.0
00:01:34.417 [672/707] Linking target lib/librte_bitratestats.so.24.0
00:01:34.417 [673/707] Linking target lib/librte_dispatcher.so.24.0
00:01:34.417 [674/707] Linking target lib/librte_pdump.so.24.0
00:01:34.417 [675/707] Linking target lib/librte_graph.so.24.0
00:01:34.417 [676/707] Linking target lib/librte_port.so.24.0
00:01:34.417 [677/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols
00:01:34.417 [678/707] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols
00:01:34.676 [679/707] Linking target lib/librte_node.so.24.0
00:01:34.676 [680/707] Linking target lib/librte_table.so.24.0
00:01:34.676 [681/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o
00:01:34.676 [682/707] Linking static target lib/librte_vhost.a
00:01:34.676 [683/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols
00:01:35.242 [684/707] Linking target app/dpdk-pdump
00:01:35.242 [685/707] Linking target app/dpdk-test-acl
00:01:35.242 [686/707] Linking target app/dpdk-test-gpudev
00:01:35.242 [687/707] Linking target app/dpdk-test-compress-perf
00:01:35.242 [688/707] Linking target app/dpdk-dumpcap
00:01:35.242 [689/707] Linking target app/dpdk-test-pipeline
00:01:35.242 [690/707] Linking target app/dpdk-test-cmdline
00:01:35.242 [691/707] Linking target app/dpdk-graph
00:01:35.242 [692/707] Linking target app/dpdk-test-flow-perf
00:01:35.242 [693/707] Linking target app/dpdk-test-regex
00:01:35.242 [694/707] Linking target app/dpdk-test-crypto-perf
00:01:35.242 [695/707] Linking target app/dpdk-proc-info
00:01:35.242 [696/707] Linking target app/dpdk-test-dma-perf
00:01:35.242 [697/707] Linking target app/dpdk-test-mldev
00:01:35.242 [698/707] Linking target app/dpdk-test-fib
00:01:35.242 [699/707] Linking target app/dpdk-test-sad
00:01:35.242 [700/707] Linking target app/dpdk-test-bbdev
00:01:35.242 [701/707] Linking target app/dpdk-test-security-perf
00:01:35.242 [702/707] Linking target app/dpdk-test-eventdev
00:01:35.243 [703/707] Linking target app/dpdk-testpmd
00:01:36.618 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output)
00:01:36.877 [705/707] Linking target lib/librte_vhost.so.24.0
00:01:40.169 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output)
00:01:40.169 [707/707] Linking target lib/librte_pipeline.so.24.0
00:01:40.169 23:48:29 -- common/autobuild_common.sh@187 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install
00:01:40.169 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:01:40.169 [0/1] Installing files.
00:01:40.169 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.169 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.433 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.434 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 
00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:40.435 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.435 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:01:40.436 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:40.436 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:40.437 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node
00:01:40.437 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:01:40.438 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:01:40.438 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.438 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.438 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.438 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.438 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.439 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:01:40.703 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:01:40.703 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:01:40.703 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:40.703 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0
00:01:40.703 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.703 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:40.704 Installing
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.704 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
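The long run of header copies above and below stages DPDK's public API under dpdk/build/include so that later steps can consume the freshly built DPDK from a single prefix. A minimal sketch of compiling one translation unit straight against that staged tree; app.c is a hypothetical file here, and the real pipeline goes through pkg-config instead:

  DPDK_BUILD=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build   # staging prefix used by the installs above
  printf '#include <rte_mbuf.h>\nint main(void) { return 0; }\n' > app.c
  cc -c app.c -I"$DPDK_BUILD/include" -o app.o                           # headers resolve from the staged include dir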
00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:40.705 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:40.705 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:01:40.705 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:01:40.705 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:01:40.705 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:01:40.705 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:01:40.705 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:01:40.705 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:01:40.705 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:01:40.705 Installing symlink pointing to librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:01:40.705 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:01:40.705 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:01:40.705 Installing symlink pointing to 
librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:01:40.705 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:01:40.705 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:01:40.705 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:01:40.705 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:01:40.705 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:01:40.705 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:01:40.705 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:01:40.705 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:01:40.705 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:01:40.705 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:01:40.705 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:01:40.705 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:01:40.705 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:01:40.705 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:01:40.705 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:01:40.705 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:01:40.705 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:01:40.705 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:01:40.705 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:01:40.705 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:01:40.705 Installing symlink pointing to librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:01:40.705 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:01:40.706 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:01:40.706 Installing symlink pointing to librte_bbdev.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:01:40.706 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:01:40.706 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:01:40.706 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:01:40.706 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:01:40.706 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:01:40.706 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:01:40.706 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:01:40.706 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:01:40.706 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:01:40.706 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:01:40.706 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:01:40.706 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:01:40.706 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:01:40.706 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:01:40.706 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:01:40.706 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:01:40.706 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:01:40.706 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:01:40.706 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:01:40.706 Installing symlink pointing to librte_dispatcher.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:01:40.706 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:01:40.706 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:01:40.706 Installing symlink pointing to librte_gro.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:01:40.706 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:01:40.706 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:01:40.706 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:01:40.706 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:01:40.706 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:01:40.706 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:01:40.706 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:01:40.706 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:01:40.706 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:01:40.706 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:01:40.706 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:01:40.706 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:01:40.706 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:01:40.706 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:01:40.706 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:01:40.706 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:01:40.706 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:01:40.706 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:01:40.706 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:01:40.706 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:01:40.706 Installing symlink pointing to librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:01:40.706 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:01:40.706 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:01:40.706 Installing symlink pointing to 
librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:01:40.706 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:01:40.706 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:01:40.706 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:01:40.706 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:01:40.706 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:01:40.706 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:01:40.706 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:01:40.706 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:01:40.706 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:01:40.706 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:01:40.706 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:01:40.706 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:01:40.706 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:01:40.706 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:01:40.706 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:01:40.706 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:01:40.706 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:01:40.706 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:01:40.706 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:01:40.706 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:01:40.706 Installing symlink pointing to librte_pdump.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:01:40.706 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:01:40.706 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:01:40.706 Installing symlink pointing to librte_pipeline.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:01:40.706 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:01:40.706 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24 00:01:40.706 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:01:40.706 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24 00:01:40.706 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so' 00:01:40.706 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24' 00:01:40.706 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0' 00:01:40.706 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so' 00:01:40.706 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24' 00:01:40.706 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0' 00:01:40.706 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so' 00:01:40.706 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24' 00:01:40.706 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0' 00:01:40.706 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so' 00:01:40.706 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24' 00:01:40.706 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0' 00:01:40.706 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:01:40.706 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24 00:01:40.706 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:01:40.706 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24 00:01:40.706 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:01:40.706 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24 00:01:40.706 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:01:40.706 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24 00:01:40.706 Installing symlink pointing to librte_net_i40e.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:01:40.706 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0' 00:01:40.706 23:48:30 -- common/autobuild_common.sh@189 -- $ uname -s 00:01:40.706 23:48:30 -- common/autobuild_common.sh@189 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:01:40.706 23:48:30 -- common/autobuild_common.sh@200 -- $ cat 
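The symlink installs above build the standard ELF shared-library chain for every DPDK component: the fully versioned file carries the code, the SONAME link (.so.24) serves the dynamic loader, and the bare .so serves the link editor. Roughly what each triple amounts to, shown for librte_eal as a sketch (meson's install step and symlink-drivers-solibs.sh do this themselves):

  cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
  ln -sf librte_eal.so.24.0 librte_eal.so.24   # runtime lookup via the SONAME recorded in dependents
  ln -sf librte_eal.so.24 librte_eal.so        # link-time lookup via -lrte_eal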
00:01:40.706 23:48:30 -- common/autobuild_common.sh@205 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:40.706 00:01:40.706 real 0m27.626s 00:01:40.706 user 7m59.192s 00:01:40.706 sys 2m31.344s 00:01:40.706 23:48:30 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:40.706 23:48:30 -- common/autotest_common.sh@10 -- $ set +x 00:01:40.706 ************************************ 00:01:40.706 END TEST build_native_dpdk 00:01:40.706 ************************************ 00:01:40.706 23:48:30 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:40.706 23:48:30 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:40.706 23:48:30 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:01:40.706 23:48:30 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:01:40.706 23:48:30 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:01:40.706 23:48:30 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:01:40.706 23:48:30 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:01:40.706 23:48:30 -- common/autotest_common.sh@10 -- $ set +x 00:01:40.706 ************************************ 00:01:40.706 START TEST autobuild_llvm_precompile 00:01:40.706 ************************************ 00:01:40.706 23:48:30 -- common/autotest_common.sh@1104 -- $ _llvm_precompile 00:01:40.706 23:48:30 -- common/autobuild_common.sh@32 -- $ clang --version 00:01:40.965 23:48:30 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38) 00:01:40.965 Target: x86_64-redhat-linux-gnu 00:01:40.965 Thread model: posix 00:01:40.965 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:01:40.965 23:48:30 -- common/autobuild_common.sh@33 -- $ clang_num=16 00:01:40.965 23:48:30 -- common/autobuild_common.sh@35 -- $ export CC=clang-16 00:01:40.965 23:48:30 -- common/autobuild_common.sh@35 -- $ CC=clang-16 00:01:40.965 23:48:30 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16 00:01:40.965 23:48:30 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16 00:01:40.965 23:48:30 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a) 00:01:40.965 23:48:30 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:01:40.965 23:48:30 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]] 00:01:40.965 23:48:30 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a' 00:01:40.966 23:48:30 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:01:41.225 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 
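The autobuild_common.sh xtrace above shows how the fuzzer precompile selects its toolchain: it matches the output of clang --version against a bash regex and takes the major version (16 here) to form CC=clang-16 and the libclang_rt.fuzzer_no_main library path. A standalone sketch of the same pattern; variable names are illustrative, not the script's:

  ver_re='version (([0-9]+)\.([0-9]+)\.([0-9]+))'
  if [[ "$(clang --version)" =~ $ver_re ]]; then
      clang_major=${BASH_REMATCH[2]}                        # 16 for "clang version 16.0.6"
      export CC=clang-$clang_major CXX=clang++-$clang_major
  fi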
00:01:41.225 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.225 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:41.483 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:41.742 Using 'verbs' RDMA provider 00:01:57.600 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:09.814 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:10.073 Creating mk/config.mk...done. 00:02:10.073 Creating mk/cc.flags.mk...done. 00:02:10.073 Type 'make' to build. 00:02:10.073 00:02:10.073 real 0m29.209s 00:02:10.073 user 0m12.405s 00:02:10.073 sys 0m16.143s 00:02:10.073 23:48:59 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:10.073 23:48:59 -- common/autotest_common.sh@10 -- $ set +x 00:02:10.073 ************************************ 00:02:10.073 END TEST autobuild_llvm_precompile 00:02:10.073 ************************************ 00:02:10.073 23:48:59 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:10.073 23:48:59 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:10.073 23:48:59 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:10.073 23:48:59 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:10.073 23:48:59 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:02:10.332 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:10.590 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:10.590 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:10.590 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:11.158 Using 'verbs' RDMA provider 00:02:23.929 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:36.132 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:36.132 Creating mk/config.mk...done. 00:02:36.132 Creating mk/cc.flags.mk...done. 00:02:36.132 Type 'make' to build. 00:02:36.132 23:49:25 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:02:36.132 23:49:25 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:02:36.132 23:49:25 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:02:36.132 23:49:25 -- common/autotest_common.sh@10 -- $ set +x 00:02:36.132 ************************************ 00:02:36.132 START TEST make 00:02:36.132 ************************************ 00:02:36.132 23:49:25 -- common/autotest_common.sh@1104 -- $ make -j112 00:02:36.132 make[1]: Nothing to be done for 'all'. 
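Both configure runs above resolve the just-built DPDK through the staged pkg-config files ("Using .../dpdk/build/lib/pkgconfig for additional libs...") rather than through a system install. A minimal sketch of the same lookup, using the paths installed earlier in this log:

  export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
  pkg-config --modversion libdpdk      # should report 23.11.0 per the dpdk log above
  pkg-config --cflags --libs libdpdk   # yields -I.../dpdk/build/include plus the -lrte_* link flags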
00:02:38.037 The Meson build system 00:02:38.037 Version: 1.3.1 00:02:38.037 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:02:38.037 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:38.037 Build type: native build 00:02:38.037 Project name: libvfio-user 00:02:38.037 Project version: 0.0.1 00:02:38.037 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)") 00:02:38.037 C linker for the host machine: clang-16 ld.bfd 2.39-16 00:02:38.037 Host machine cpu family: x86_64 00:02:38.037 Host machine cpu: x86_64 00:02:38.037 Run-time dependency threads found: YES 00:02:38.037 Library dl found: YES 00:02:38.037 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:02:38.037 Run-time dependency json-c found: YES 0.17 00:02:38.037 Run-time dependency cmocka found: YES 1.1.7 00:02:38.037 Program pytest-3 found: NO 00:02:38.037 Program flake8 found: NO 00:02:38.037 Program misspell-fixer found: NO 00:02:38.037 Program restructuredtext-lint found: NO 00:02:38.037 Program valgrind found: YES (/usr/bin/valgrind) 00:02:38.037 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:38.037 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:38.037 Compiler for C supports arguments -Wwrite-strings: YES 00:02:38.037 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:02:38.037 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:02:38.037 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:02:38.037 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
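The Meson configure output above (targets and option summary just below) comes from SPDK's bundled libvfio-user. Roughly the equivalent standalone invocation, inferred from the source/build dirs and the "User defined options" summary; a sketch, not the exact command SPDK's build issues:

  BUILD=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
  meson setup "$BUILD" /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user \
        --buildtype=debug --default-library=static --libdir=/usr/local/lib
  ninja -C "$BUILD"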
00:02:38.037 Build targets in project: 8 00:02:38.037 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:02:38.037 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:02:38.037 00:02:38.037 libvfio-user 0.0.1 00:02:38.037 00:02:38.037 User defined options 00:02:38.037 buildtype : debug 00:02:38.038 default_library: static 00:02:38.038 libdir : /usr/local/lib 00:02:38.038 00:02:38.038 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:38.038 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:38.038 [1/36] Compiling C object samples/null.p/null.c.o 00:02:38.296 [2/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:02:38.296 [3/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:02:38.296 [4/36] Compiling C object samples/lspci.p/lspci.c.o 00:02:38.296 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:02:38.296 [6/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:02:38.296 [7/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:02:38.296 [8/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:02:38.296 [9/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:02:38.296 [10/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:02:38.296 [11/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:02:38.296 [12/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:02:38.296 [13/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:02:38.296 [14/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:02:38.296 [15/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:02:38.296 [16/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:02:38.296 [17/36] Compiling C object test/unit_tests.p/mocks.c.o 00:02:38.296 [18/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:02:38.296 [19/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:02:38.296 [20/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:02:38.296 [21/36] Compiling C object samples/server.p/server.c.o 00:02:38.296 [22/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:02:38.296 [23/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:02:38.296 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:02:38.296 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:02:38.296 [26/36] Compiling C object samples/client.p/client.c.o 00:02:38.296 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:02:38.296 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:02:38.296 [29/36] Linking static target lib/libvfio-user.a 00:02:38.296 [30/36] Linking target samples/client 00:02:38.296 [31/36] Linking target samples/server 00:02:38.296 [32/36] Linking target test/unit_tests 00:02:38.296 [33/36] Linking target samples/gpio-pci-idio-16 00:02:38.296 [34/36] Linking target samples/shadow_ioeventfd_server 00:02:38.296 [35/36] Linking target samples/null 00:02:38.296 [36/36] Linking target samples/lspci 00:02:38.296 INFO: autodetecting backend as ninja 00:02:38.296 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:38.296 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:38.863 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:38.863 ninja: no work to do. 00:02:42.150 CC lib/ut/ut.o 00:02:42.150 CC lib/ut_mock/mock.o 00:02:42.150 CC lib/log/log.o 00:02:42.150 CC lib/log/log_deprecated.o 00:02:42.150 CC lib/log/log_flags.o 00:02:42.150 LIB libspdk_ut.a 00:02:42.150 LIB libspdk_ut_mock.a 00:02:42.150 LIB libspdk_log.a 00:02:42.409 CC lib/ioat/ioat.o 00:02:42.409 CC lib/dma/dma.o 00:02:42.409 CC lib/util/base64.o 00:02:42.409 CXX lib/trace_parser/trace.o 00:02:42.409 CC lib/util/cpuset.o 00:02:42.409 CC lib/util/bit_array.o 00:02:42.409 CC lib/util/crc16.o 00:02:42.409 CC lib/util/crc32.o 00:02:42.409 CC lib/util/crc32c.o 00:02:42.409 CC lib/util/crc32_ieee.o 00:02:42.409 CC lib/util/crc64.o 00:02:42.409 CC lib/util/dif.o 00:02:42.409 CC lib/util/fd.o 00:02:42.409 CC lib/util/file.o 00:02:42.409 CC lib/util/hexlify.o 00:02:42.409 CC lib/util/iov.o 00:02:42.409 CC lib/util/math.o 00:02:42.409 CC lib/util/pipe.o 00:02:42.409 CC lib/util/strerror_tls.o 00:02:42.409 CC lib/util/string.o 00:02:42.409 CC lib/util/fd_group.o 00:02:42.409 CC lib/util/uuid.o 00:02:42.409 CC lib/util/zipf.o 00:02:42.409 CC lib/util/xor.o 00:02:42.409 CC lib/vfio_user/host/vfio_user.o 00:02:42.409 CC lib/vfio_user/host/vfio_user_pci.o 00:02:42.409 LIB libspdk_dma.a 00:02:42.668 LIB libspdk_ioat.a 00:02:42.668 LIB libspdk_vfio_user.a 00:02:42.668 LIB libspdk_util.a 00:02:42.927 LIB libspdk_trace_parser.a 00:02:42.927 CC lib/vmd/vmd.o 00:02:42.927 CC lib/vmd/led.o 00:02:42.927 CC lib/env_dpdk/env.o 00:02:42.927 CC lib/env_dpdk/memory.o 00:02:42.927 CC lib/env_dpdk/pci.o 00:02:42.927 CC lib/json/json_util.o 00:02:42.927 CC lib/env_dpdk/init.o 00:02:42.927 CC lib/json/json_parse.o 00:02:42.927 CC lib/env_dpdk/threads.o 00:02:42.927 CC lib/idxd/idxd.o 00:02:42.927 CC lib/env_dpdk/pci_ioat.o 00:02:42.927 CC lib/json/json_write.o 00:02:42.927 CC lib/env_dpdk/pci_idxd.o 00:02:42.927 CC lib/env_dpdk/pci_virtio.o 00:02:42.927 CC lib/env_dpdk/pci_event.o 00:02:42.927 CC lib/idxd/idxd_user.o 00:02:42.927 CC lib/env_dpdk/pci_vmd.o 00:02:42.927 CC lib/env_dpdk/pci_dpdk.o 00:02:42.927 CC lib/env_dpdk/sigbus_handler.o 00:02:42.927 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:42.927 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:42.927 CC lib/rdma/common.o 00:02:42.927 CC lib/rdma/rdma_verbs.o 00:02:42.927 CC lib/conf/conf.o 00:02:43.185 LIB libspdk_conf.a 00:02:43.185 LIB libspdk_json.a 00:02:43.185 LIB libspdk_rdma.a 00:02:43.442 LIB libspdk_idxd.a 00:02:43.442 LIB libspdk_vmd.a 00:02:43.442 CC lib/jsonrpc/jsonrpc_server.o 00:02:43.442 CC lib/jsonrpc/jsonrpc_client.o 00:02:43.442 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:43.442 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:43.699 LIB libspdk_jsonrpc.a 00:02:43.957 LIB libspdk_env_dpdk.a 00:02:43.957 CC lib/rpc/rpc.o 00:02:44.215 LIB libspdk_rpc.a 00:02:44.473 CC lib/notify/notify.o 00:02:44.473 CC lib/notify/notify_rpc.o 00:02:44.473 CC lib/trace/trace.o 00:02:44.473 CC lib/trace/trace_flags.o 00:02:44.473 CC lib/trace/trace_rpc.o 00:02:44.473 CC lib/sock/sock.o 00:02:44.473 CC lib/sock/sock_rpc.o 00:02:44.473 LIB libspdk_notify.a 00:02:44.473 LIB libspdk_trace.a 00:02:44.731 LIB libspdk_sock.a 00:02:44.731 CC lib/thread/thread.o 00:02:44.731 CC lib/thread/iobuf.o 00:02:44.988 CC lib/nvme/nvme_fabric.o 00:02:44.988 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:44.988 CC lib/nvme/nvme_ctrlr.o 00:02:44.988 CC 
lib/nvme/nvme_ns_cmd.o 00:02:44.988 CC lib/nvme/nvme_ns.o 00:02:44.988 CC lib/nvme/nvme_pcie_common.o 00:02:44.988 CC lib/nvme/nvme_pcie.o 00:02:44.988 CC lib/nvme/nvme_qpair.o 00:02:44.988 CC lib/nvme/nvme.o 00:02:44.988 CC lib/nvme/nvme_quirks.o 00:02:44.988 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:44.988 CC lib/nvme/nvme_transport.o 00:02:44.988 CC lib/nvme/nvme_discovery.o 00:02:44.988 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:44.988 CC lib/nvme/nvme_tcp.o 00:02:44.988 CC lib/nvme/nvme_io_msg.o 00:02:44.988 CC lib/nvme/nvme_opal.o 00:02:44.988 CC lib/nvme/nvme_poll_group.o 00:02:44.988 CC lib/nvme/nvme_zns.o 00:02:44.988 CC lib/nvme/nvme_cuse.o 00:02:44.988 CC lib/nvme/nvme_vfio_user.o 00:02:44.988 CC lib/nvme/nvme_rdma.o 00:02:45.553 LIB libspdk_thread.a 00:02:45.811 CC lib/blob/blobstore.o 00:02:45.811 CC lib/blob/request.o 00:02:45.811 CC lib/blob/zeroes.o 00:02:45.811 CC lib/blob/blob_bs_dev.o 00:02:45.811 CC lib/init/json_config.o 00:02:45.811 CC lib/init/subsystem.o 00:02:46.069 CC lib/init/subsystem_rpc.o 00:02:46.069 CC lib/init/rpc.o 00:02:46.069 CC lib/vfu_tgt/tgt_endpoint.o 00:02:46.069 CC lib/virtio/virtio.o 00:02:46.069 CC lib/accel/accel.o 00:02:46.069 CC lib/vfu_tgt/tgt_rpc.o 00:02:46.069 CC lib/virtio/virtio_vhost_user.o 00:02:46.069 CC lib/accel/accel_rpc.o 00:02:46.069 CC lib/virtio/virtio_vfio_user.o 00:02:46.069 CC lib/accel/accel_sw.o 00:02:46.069 CC lib/virtio/virtio_pci.o 00:02:46.069 LIB libspdk_init.a 00:02:46.069 LIB libspdk_virtio.a 00:02:46.069 LIB libspdk_vfu_tgt.a 00:02:46.069 LIB libspdk_nvme.a 00:02:46.327 CC lib/event/app.o 00:02:46.327 CC lib/event/reactor.o 00:02:46.327 CC lib/event/log_rpc.o 00:02:46.327 CC lib/event/app_rpc.o 00:02:46.327 CC lib/event/scheduler_static.o 00:02:46.586 LIB libspdk_accel.a 00:02:46.586 LIB libspdk_event.a 00:02:46.845 CC lib/bdev/bdev.o 00:02:46.845 CC lib/bdev/bdev_rpc.o 00:02:46.845 CC lib/bdev/part.o 00:02:46.845 CC lib/bdev/bdev_zone.o 00:02:46.846 CC lib/bdev/scsi_nvme.o 00:02:47.414 LIB libspdk_blob.a 00:02:47.673 CC lib/lvol/lvol.o 00:02:47.673 CC lib/blobfs/blobfs.o 00:02:47.673 CC lib/blobfs/tree.o 00:02:48.241 LIB libspdk_lvol.a 00:02:48.241 LIB libspdk_blobfs.a 00:02:48.500 LIB libspdk_bdev.a 00:02:48.760 CC lib/nbd/nbd.o 00:02:48.760 CC lib/nbd/nbd_rpc.o 00:02:49.020 CC lib/ftl/ftl_init.o 00:02:49.020 CC lib/ftl/ftl_core.o 00:02:49.020 CC lib/ftl/ftl_layout.o 00:02:49.020 CC lib/ftl/ftl_debug.o 00:02:49.020 CC lib/ftl/ftl_sb.o 00:02:49.020 CC lib/ftl/ftl_io.o 00:02:49.020 CC lib/ftl/ftl_l2p.o 00:02:49.020 CC lib/ftl/ftl_l2p_flat.o 00:02:49.020 CC lib/ftl/ftl_writer.o 00:02:49.020 CC lib/ftl/ftl_nv_cache.o 00:02:49.020 CC lib/nvmf/ctrlr.o 00:02:49.020 CC lib/ftl/ftl_band.o 00:02:49.020 CC lib/ftl/ftl_band_ops.o 00:02:49.020 CC lib/nvmf/ctrlr_discovery.o 00:02:49.020 CC lib/ublk/ublk.o 00:02:49.020 CC lib/nvmf/ctrlr_bdev.o 00:02:49.020 CC lib/ublk/ublk_rpc.o 00:02:49.020 CC lib/ftl/ftl_rq.o 00:02:49.020 CC lib/ftl/ftl_l2p_cache.o 00:02:49.020 CC lib/nvmf/subsystem.o 00:02:49.020 CC lib/nvmf/nvmf.o 00:02:49.020 CC lib/ftl/ftl_reloc.o 00:02:49.020 CC lib/nvmf/nvmf_rpc.o 00:02:49.020 CC lib/scsi/dev.o 00:02:49.020 CC lib/ftl/ftl_p2l.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt.o 00:02:49.020 CC lib/scsi/lun.o 00:02:49.020 CC lib/nvmf/transport.o 00:02:49.020 CC lib/scsi/port.o 00:02:49.020 CC lib/nvmf/tcp.o 00:02:49.020 CC lib/scsi/scsi.o 00:02:49.020 CC lib/nvmf/vfio_user.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_startup.o 
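The CC and LIB lines in this stretch are SPDK's quiet make output: each component library (log, util, nvme, thread, blob, bdev, ftl, nvmf, and so on) is compiled object by object and then archived into a libspdk_*.a static library. A sketch of the kind of top-level build that emits them; the job's real configure flags are not visible at this point in the log, and --with-vfio-user is only inferred from the vfu_tgt and vfio_user objects being compiled, so treat the flag set as an assumption:

    # Sketch: the configure + make pair behind the CC/LIB/LINK lines.
    # The actual flag set used by this CI job is an assumption here.
    cd spdk
    ./configure --with-vfio-user
    make -j"$(nproc)"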
00:02:49.020 CC lib/scsi/scsi_pr.o 00:02:49.020 CC lib/scsi/scsi_bdev.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:49.020 CC lib/nvmf/rdma.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:49.020 CC lib/scsi/scsi_rpc.o 00:02:49.020 CC lib/scsi/task.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:49.020 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:49.020 CC lib/ftl/utils/ftl_conf.o 00:02:49.020 CC lib/ftl/utils/ftl_md.o 00:02:49.020 CC lib/ftl/utils/ftl_mempool.o 00:02:49.020 CC lib/ftl/utils/ftl_bitmap.o 00:02:49.020 CC lib/ftl/utils/ftl_property.o 00:02:49.020 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:49.020 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:49.020 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:49.020 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:49.020 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:49.020 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:49.020 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:49.020 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:49.020 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:49.020 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:49.020 CC lib/ftl/base/ftl_base_dev.o 00:02:49.020 CC lib/ftl/base/ftl_base_bdev.o 00:02:49.020 CC lib/ftl/ftl_trace.o 00:02:49.279 LIB libspdk_nbd.a 00:02:49.279 LIB libspdk_scsi.a 00:02:49.279 LIB libspdk_ublk.a 00:02:49.538 LIB libspdk_ftl.a 00:02:49.538 CC lib/vhost/vhost.o 00:02:49.538 CC lib/vhost/vhost_rpc.o 00:02:49.538 CC lib/vhost/vhost_blk.o 00:02:49.538 CC lib/vhost/vhost_scsi.o 00:02:49.538 CC lib/vhost/rte_vhost_user.o 00:02:49.538 CC lib/iscsi/conn.o 00:02:49.538 CC lib/iscsi/init_grp.o 00:02:49.538 CC lib/iscsi/iscsi.o 00:02:49.538 CC lib/iscsi/md5.o 00:02:49.538 CC lib/iscsi/param.o 00:02:49.538 CC lib/iscsi/portal_grp.o 00:02:49.538 CC lib/iscsi/iscsi_rpc.o 00:02:49.538 CC lib/iscsi/tgt_node.o 00:02:49.538 CC lib/iscsi/task.o 00:02:49.538 CC lib/iscsi/iscsi_subsystem.o 00:02:50.107 LIB libspdk_nvmf.a 00:02:50.107 LIB libspdk_vhost.a 00:02:50.366 LIB libspdk_iscsi.a 00:02:50.935 CC module/env_dpdk/env_dpdk_rpc.o 00:02:50.935 CC module/vfu_device/vfu_virtio_blk.o 00:02:50.935 CC module/vfu_device/vfu_virtio_scsi.o 00:02:50.935 CC module/vfu_device/vfu_virtio.o 00:02:50.935 CC module/vfu_device/vfu_virtio_rpc.o 00:02:50.935 CC module/scheduler/gscheduler/gscheduler.o 00:02:50.935 CC module/accel/iaa/accel_iaa.o 00:02:50.935 CC module/accel/iaa/accel_iaa_rpc.o 00:02:50.935 CC module/accel/dsa/accel_dsa_rpc.o 00:02:50.935 CC module/accel/dsa/accel_dsa.o 00:02:50.935 LIB libspdk_env_dpdk_rpc.a 00:02:50.935 CC module/blob/bdev/blob_bdev.o 00:02:50.935 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:50.935 CC module/accel/ioat/accel_ioat.o 00:02:50.935 CC module/accel/ioat/accel_ioat_rpc.o 00:02:50.935 CC module/accel/error/accel_error.o 00:02:50.935 CC module/accel/error/accel_error_rpc.o 00:02:50.935 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:50.935 CC module/sock/posix/posix.o 00:02:51.194 LIB libspdk_scheduler_gscheduler.a 00:02:51.194 LIB libspdk_scheduler_dpdk_governor.a 00:02:51.194 LIB libspdk_accel_iaa.a 00:02:51.194 LIB libspdk_accel_error.a 00:02:51.194 LIB libspdk_accel_ioat.a 00:02:51.194 LIB libspdk_scheduler_dynamic.a 00:02:51.194 LIB libspdk_accel_dsa.a 00:02:51.194 LIB libspdk_blob_bdev.a 00:02:51.194 LIB libspdk_vfu_device.a 00:02:51.452 LIB libspdk_sock_posix.a 00:02:51.452 CC 
module/bdev/malloc/bdev_malloc.o 00:02:51.452 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:51.452 CC module/blobfs/bdev/blobfs_bdev.o 00:02:51.452 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:51.452 CC module/bdev/gpt/gpt.o 00:02:51.452 CC module/bdev/gpt/vbdev_gpt.o 00:02:51.452 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:51.452 CC module/bdev/delay/vbdev_delay.o 00:02:51.452 CC module/bdev/iscsi/bdev_iscsi.o 00:02:51.452 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:51.452 CC module/bdev/nvme/bdev_nvme.o 00:02:51.710 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:51.710 CC module/bdev/nvme/nvme_rpc.o 00:02:51.710 CC module/bdev/nvme/bdev_mdns_client.o 00:02:51.710 CC module/bdev/lvol/vbdev_lvol.o 00:02:51.710 CC module/bdev/nvme/vbdev_opal.o 00:02:51.710 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:51.710 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:51.710 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:51.710 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:51.710 CC module/bdev/passthru/vbdev_passthru.o 00:02:51.710 CC module/bdev/null/bdev_null.o 00:02:51.710 CC module/bdev/error/vbdev_error.o 00:02:51.710 CC module/bdev/null/bdev_null_rpc.o 00:02:51.710 CC module/bdev/error/vbdev_error_rpc.o 00:02:51.710 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:51.710 CC module/bdev/ftl/bdev_ftl.o 00:02:51.710 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:51.710 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:51.710 CC module/bdev/raid/bdev_raid.o 00:02:51.710 CC module/bdev/aio/bdev_aio.o 00:02:51.710 CC module/bdev/split/vbdev_split.o 00:02:51.710 CC module/bdev/raid/bdev_raid_rpc.o 00:02:51.710 CC module/bdev/aio/bdev_aio_rpc.o 00:02:51.710 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:51.710 CC module/bdev/split/vbdev_split_rpc.o 00:02:51.710 CC module/bdev/raid/bdev_raid_sb.o 00:02:51.710 CC module/bdev/raid/raid0.o 00:02:51.710 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:51.710 CC module/bdev/raid/raid1.o 00:02:51.710 CC module/bdev/raid/concat.o 00:02:51.710 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:51.710 LIB libspdk_blobfs_bdev.a 00:02:51.710 LIB libspdk_bdev_split.a 00:02:51.710 LIB libspdk_bdev_gpt.a 00:02:51.710 LIB libspdk_bdev_error.a 00:02:51.710 LIB libspdk_bdev_null.a 00:02:51.710 LIB libspdk_bdev_malloc.a 00:02:51.710 LIB libspdk_bdev_passthru.a 00:02:51.710 LIB libspdk_bdev_ftl.a 00:02:51.710 LIB libspdk_bdev_iscsi.a 00:02:51.710 LIB libspdk_bdev_aio.a 00:02:51.710 LIB libspdk_bdev_delay.a 00:02:51.969 LIB libspdk_bdev_zone_block.a 00:02:51.969 LIB libspdk_bdev_lvol.a 00:02:51.969 LIB libspdk_bdev_virtio.a 00:02:52.227 LIB libspdk_bdev_raid.a 00:02:52.795 LIB libspdk_bdev_nvme.a 00:02:53.361 CC module/event/subsystems/vmd/vmd.o 00:02:53.361 CC module/event/subsystems/scheduler/scheduler.o 00:02:53.361 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:53.361 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:53.361 CC module/event/subsystems/iobuf/iobuf.o 00:02:53.361 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:53.361 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:53.361 CC module/event/subsystems/sock/sock.o 00:02:53.361 LIB libspdk_event_scheduler.a 00:02:53.617 LIB libspdk_event_vmd.a 00:02:53.617 LIB libspdk_event_vfu_tgt.a 00:02:53.617 LIB libspdk_event_sock.a 00:02:53.617 LIB libspdk_event_vhost_blk.a 00:02:53.617 LIB libspdk_event_iobuf.a 00:02:53.875 CC module/event/subsystems/accel/accel.o 00:02:53.875 LIB libspdk_event_accel.a 00:02:54.133 CC module/event/subsystems/bdev/bdev.o 00:02:54.391 LIB libspdk_event_bdev.a 00:02:54.650 CC 
module/event/subsystems/scsi/scsi.o 00:02:54.650 CC module/event/subsystems/nbd/nbd.o 00:02:54.650 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:54.650 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:54.650 CC module/event/subsystems/ublk/ublk.o 00:02:54.650 LIB libspdk_event_nbd.a 00:02:54.650 LIB libspdk_event_scsi.a 00:02:54.650 LIB libspdk_event_ublk.a 00:02:54.909 LIB libspdk_event_nvmf.a 00:02:55.167 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:55.167 CC module/event/subsystems/iscsi/iscsi.o 00:02:55.167 LIB libspdk_event_vhost_scsi.a 00:02:55.167 LIB libspdk_event_iscsi.a 00:02:55.426 CC test/rpc_client/rpc_client_test.o 00:02:55.426 TEST_HEADER include/spdk/accel.h 00:02:55.426 TEST_HEADER include/spdk/accel_module.h 00:02:55.427 TEST_HEADER include/spdk/assert.h 00:02:55.427 TEST_HEADER include/spdk/barrier.h 00:02:55.427 TEST_HEADER include/spdk/base64.h 00:02:55.427 TEST_HEADER include/spdk/bdev.h 00:02:55.427 TEST_HEADER include/spdk/bdev_module.h 00:02:55.427 TEST_HEADER include/spdk/bdev_zone.h 00:02:55.427 TEST_HEADER include/spdk/bit_pool.h 00:02:55.427 TEST_HEADER include/spdk/bit_array.h 00:02:55.427 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:55.427 TEST_HEADER include/spdk/blobfs.h 00:02:55.427 TEST_HEADER include/spdk/blob_bdev.h 00:02:55.427 TEST_HEADER include/spdk/blob.h 00:02:55.427 TEST_HEADER include/spdk/config.h 00:02:55.427 TEST_HEADER include/spdk/conf.h 00:02:55.427 TEST_HEADER include/spdk/cpuset.h 00:02:55.427 TEST_HEADER include/spdk/crc16.h 00:02:55.427 TEST_HEADER include/spdk/crc32.h 00:02:55.427 TEST_HEADER include/spdk/crc64.h 00:02:55.427 TEST_HEADER include/spdk/dma.h 00:02:55.427 TEST_HEADER include/spdk/dif.h 00:02:55.427 CC app/spdk_nvme_perf/perf.o 00:02:55.427 CC app/spdk_nvme_discover/discovery_aer.o 00:02:55.427 TEST_HEADER include/spdk/endian.h 00:02:55.427 TEST_HEADER include/spdk/env_dpdk.h 00:02:55.427 TEST_HEADER include/spdk/event.h 00:02:55.427 TEST_HEADER include/spdk/fd_group.h 00:02:55.427 TEST_HEADER include/spdk/env.h 00:02:55.427 TEST_HEADER include/spdk/file.h 00:02:55.427 TEST_HEADER include/spdk/fd.h 00:02:55.427 TEST_HEADER include/spdk/ftl.h 00:02:55.427 TEST_HEADER include/spdk/gpt_spec.h 00:02:55.427 TEST_HEADER include/spdk/hexlify.h 00:02:55.427 TEST_HEADER include/spdk/histogram_data.h 00:02:55.427 TEST_HEADER include/spdk/idxd_spec.h 00:02:55.427 TEST_HEADER include/spdk/idxd.h 00:02:55.427 TEST_HEADER include/spdk/init.h 00:02:55.427 TEST_HEADER include/spdk/ioat.h 00:02:55.427 TEST_HEADER include/spdk/ioat_spec.h 00:02:55.427 TEST_HEADER include/spdk/iscsi_spec.h 00:02:55.427 TEST_HEADER include/spdk/json.h 00:02:55.427 TEST_HEADER include/spdk/jsonrpc.h 00:02:55.427 TEST_HEADER include/spdk/likely.h 00:02:55.427 CC app/spdk_top/spdk_top.o 00:02:55.427 TEST_HEADER include/spdk/log.h 00:02:55.427 TEST_HEADER include/spdk/lvol.h 00:02:55.427 TEST_HEADER include/spdk/memory.h 00:02:55.427 TEST_HEADER include/spdk/mmio.h 00:02:55.427 CC app/trace_record/trace_record.o 00:02:55.427 CXX app/trace/trace.o 00:02:55.427 TEST_HEADER include/spdk/nbd.h 00:02:55.427 TEST_HEADER include/spdk/nvme.h 00:02:55.427 TEST_HEADER include/spdk/notify.h 00:02:55.427 TEST_HEADER include/spdk/nvme_intel.h 00:02:55.427 CC app/spdk_lspci/spdk_lspci.o 00:02:55.427 CC app/spdk_nvme_identify/identify.o 00:02:55.427 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:55.427 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:55.427 CC app/iscsi_tgt/iscsi_tgt.o 00:02:55.427 TEST_HEADER include/spdk/nvme_spec.h 00:02:55.427 TEST_HEADER 
include/spdk/nvme_zns.h 00:02:55.427 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:55.427 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:55.427 TEST_HEADER include/spdk/nvmf_spec.h 00:02:55.427 TEST_HEADER include/spdk/nvmf.h 00:02:55.427 TEST_HEADER include/spdk/nvmf_transport.h 00:02:55.427 TEST_HEADER include/spdk/opal_spec.h 00:02:55.427 TEST_HEADER include/spdk/opal.h 00:02:55.427 TEST_HEADER include/spdk/pci_ids.h 00:02:55.427 TEST_HEADER include/spdk/pipe.h 00:02:55.427 TEST_HEADER include/spdk/queue.h 00:02:55.427 TEST_HEADER include/spdk/reduce.h 00:02:55.427 TEST_HEADER include/spdk/rpc.h 00:02:55.427 TEST_HEADER include/spdk/scheduler.h 00:02:55.427 TEST_HEADER include/spdk/scsi.h 00:02:55.427 TEST_HEADER include/spdk/scsi_spec.h 00:02:55.427 TEST_HEADER include/spdk/sock.h 00:02:55.427 TEST_HEADER include/spdk/stdinc.h 00:02:55.427 TEST_HEADER include/spdk/string.h 00:02:55.427 TEST_HEADER include/spdk/thread.h 00:02:55.427 TEST_HEADER include/spdk/trace.h 00:02:55.427 TEST_HEADER include/spdk/trace_parser.h 00:02:55.427 TEST_HEADER include/spdk/tree.h 00:02:55.427 TEST_HEADER include/spdk/ublk.h 00:02:55.427 TEST_HEADER include/spdk/util.h 00:02:55.427 TEST_HEADER include/spdk/version.h 00:02:55.427 TEST_HEADER include/spdk/uuid.h 00:02:55.427 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:55.427 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:55.427 TEST_HEADER include/spdk/vhost.h 00:02:55.427 TEST_HEADER include/spdk/xor.h 00:02:55.427 TEST_HEADER include/spdk/vmd.h 00:02:55.427 TEST_HEADER include/spdk/zipf.h 00:02:55.427 CXX test/cpp_headers/accel.o 00:02:55.688 CXX test/cpp_headers/accel_module.o 00:02:55.688 CXX test/cpp_headers/assert.o 00:02:55.688 CXX test/cpp_headers/barrier.o 00:02:55.688 CXX test/cpp_headers/base64.o 00:02:55.688 CXX test/cpp_headers/bdev.o 00:02:55.688 CXX test/cpp_headers/bdev_module.o 00:02:55.688 CXX test/cpp_headers/bit_array.o 00:02:55.688 CXX test/cpp_headers/bdev_zone.o 00:02:55.688 CXX test/cpp_headers/bit_pool.o 00:02:55.688 CXX test/cpp_headers/blob_bdev.o 00:02:55.688 CXX test/cpp_headers/blobfs_bdev.o 00:02:55.688 CXX test/cpp_headers/blobfs.o 00:02:55.688 CXX test/cpp_headers/blob.o 00:02:55.688 CXX test/cpp_headers/conf.o 00:02:55.688 CC app/nvmf_tgt/nvmf_main.o 00:02:55.688 CXX test/cpp_headers/config.o 00:02:55.688 CXX test/cpp_headers/cpuset.o 00:02:55.688 CXX test/cpp_headers/crc16.o 00:02:55.688 CXX test/cpp_headers/crc32.o 00:02:55.688 CXX test/cpp_headers/crc64.o 00:02:55.688 CXX test/cpp_headers/dif.o 00:02:55.688 CXX test/cpp_headers/dma.o 00:02:55.688 CXX test/cpp_headers/endian.o 00:02:55.688 CXX test/cpp_headers/env_dpdk.o 00:02:55.688 CXX test/cpp_headers/env.o 00:02:55.688 CC app/spdk_dd/spdk_dd.o 00:02:55.688 CXX test/cpp_headers/event.o 00:02:55.688 CXX test/cpp_headers/fd_group.o 00:02:55.688 CXX test/cpp_headers/fd.o 00:02:55.688 CXX test/cpp_headers/file.o 00:02:55.688 CXX test/cpp_headers/ftl.o 00:02:55.688 CXX test/cpp_headers/hexlify.o 00:02:55.688 CXX test/cpp_headers/gpt_spec.o 00:02:55.688 CXX test/cpp_headers/histogram_data.o 00:02:55.688 CXX test/cpp_headers/idxd.o 00:02:55.688 CXX test/cpp_headers/idxd_spec.o 00:02:55.688 CXX test/cpp_headers/init.o 00:02:55.688 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:55.688 CC app/vhost/vhost.o 00:02:55.688 CC app/spdk_tgt/spdk_tgt.o 00:02:55.688 CC test/thread/lock/spdk_lock.o 00:02:55.688 CC test/thread/poller_perf/poller_perf.o 00:02:55.688 CC test/nvme/reset/reset.o 00:02:55.688 CC test/app/jsoncat/jsoncat.o 00:02:55.688 CC 
test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:55.688 CC test/env/vtophys/vtophys.o 00:02:55.688 CC test/env/memory/memory_ut.o 00:02:55.688 CC test/nvme/err_injection/err_injection.o 00:02:55.688 CC test/event/reactor/reactor.o 00:02:55.688 CC test/nvme/connect_stress/connect_stress.o 00:02:55.688 CC test/event/event_perf/event_perf.o 00:02:55.688 CC test/app/stub/stub.o 00:02:55.688 CC test/env/pci/pci_ut.o 00:02:55.688 CC test/nvme/boot_partition/boot_partition.o 00:02:55.688 CC test/nvme/compliance/nvme_compliance.o 00:02:55.688 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:55.688 CC test/nvme/reserve/reserve.o 00:02:55.688 CXX test/cpp_headers/ioat.o 00:02:55.688 CC test/nvme/aer/aer.o 00:02:55.688 CC test/nvme/fdp/fdp.o 00:02:55.688 CC test/nvme/simple_copy/simple_copy.o 00:02:55.688 CC test/nvme/sgl/sgl.o 00:02:55.688 CC test/app/histogram_perf/histogram_perf.o 00:02:55.688 CC test/nvme/overhead/overhead.o 00:02:55.688 CC test/nvme/e2edp/nvme_dp.o 00:02:55.688 CC test/dma/test_dma/test_dma.o 00:02:55.688 CC test/event/reactor_perf/reactor_perf.o 00:02:55.688 CC test/nvme/cuse/cuse.o 00:02:55.688 CC test/bdev/bdevio/bdevio.o 00:02:55.688 CC test/blobfs/mkfs/mkfs.o 00:02:55.688 CC test/nvme/fused_ordering/fused_ordering.o 00:02:55.688 CC test/event/app_repeat/app_repeat.o 00:02:55.688 CC test/nvme/startup/startup.o 00:02:55.688 CC examples/idxd/perf/perf.o 00:02:55.688 CC examples/nvme/reconnect/reconnect.o 00:02:55.688 CC examples/ioat/perf/perf.o 00:02:55.688 CC examples/sock/hello_world/hello_sock.o 00:02:55.688 CC examples/vmd/lsvmd/lsvmd.o 00:02:55.688 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:55.688 CC examples/nvme/arbitration/arbitration.o 00:02:55.688 CC examples/nvme/hello_world/hello_world.o 00:02:55.688 CC examples/accel/perf/accel_perf.o 00:02:55.688 CC examples/nvme/abort/abort.o 00:02:55.688 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:55.688 CC examples/nvme/hotplug/hotplug.o 00:02:55.688 CC examples/ioat/verify/verify.o 00:02:55.688 CC examples/vmd/led/led.o 00:02:55.688 CC app/fio/nvme/fio_plugin.o 00:02:55.688 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:55.688 CC examples/util/zipf/zipf.o 00:02:55.688 CC test/event/scheduler/scheduler.o 00:02:55.688 CC test/accel/dif/dif.o 00:02:55.688 CC test/app/bdev_svc/bdev_svc.o 00:02:55.688 LINK rpc_client_test 00:02:55.688 LINK spdk_lspci 00:02:55.688 CC examples/bdev/hello_world/hello_bdev.o 00:02:55.688 CC examples/nvmf/nvmf/nvmf.o 00:02:55.688 CC examples/thread/thread/thread_ex.o 00:02:55.688 CC examples/blob/cli/blobcli.o 00:02:55.688 CC app/fio/bdev/fio_plugin.o 00:02:55.688 CC test/lvol/esnap/esnap.o 00:02:55.688 CC examples/blob/hello_world/hello_blob.o 00:02:55.688 CC examples/bdev/bdevperf/bdevperf.o 00:02:55.688 CC test/env/mem_callbacks/mem_callbacks.o 00:02:55.688 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:55.688 LINK spdk_nvme_discover 00:02:55.688 CXX test/cpp_headers/ioat_spec.o 00:02:55.688 CXX test/cpp_headers/iscsi_spec.o 00:02:55.967 CXX test/cpp_headers/json.o 00:02:55.967 CXX test/cpp_headers/jsonrpc.o 00:02:55.967 CXX test/cpp_headers/likely.o 00:02:55.967 CXX test/cpp_headers/log.o 00:02:55.967 CXX test/cpp_headers/lvol.o 00:02:55.967 CXX test/cpp_headers/memory.o 00:02:55.967 CXX test/cpp_headers/mmio.o 00:02:55.967 CXX test/cpp_headers/nbd.o 00:02:55.967 CXX test/cpp_headers/notify.o 00:02:55.967 LINK jsoncat 00:02:55.967 CXX test/cpp_headers/nvme.o 00:02:55.967 LINK poller_perf 00:02:55.967 CXX test/cpp_headers/nvme_intel.o 00:02:55.967 CXX 
test/cpp_headers/nvme_ocssd.o 00:02:55.967 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:55.967 CXX test/cpp_headers/nvme_spec.o 00:02:55.967 CXX test/cpp_headers/nvme_zns.o 00:02:55.967 LINK vtophys 00:02:55.967 CXX test/cpp_headers/nvmf_cmd.o 00:02:55.967 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:55.967 LINK spdk_trace_record 00:02:55.967 CXX test/cpp_headers/nvmf.o 00:02:55.967 LINK iscsi_tgt 00:02:55.967 CXX test/cpp_headers/nvmf_spec.o 00:02:55.967 CXX test/cpp_headers/nvmf_transport.o 00:02:55.967 LINK event_perf 00:02:55.967 CXX test/cpp_headers/opal.o 00:02:55.967 CXX test/cpp_headers/opal_spec.o 00:02:55.967 LINK nvmf_tgt 00:02:55.967 CXX test/cpp_headers/pci_ids.o 00:02:55.967 CXX test/cpp_headers/pipe.o 00:02:55.967 LINK reactor 00:02:55.967 CXX test/cpp_headers/queue.o 00:02:55.967 LINK histogram_perf 00:02:55.967 CXX test/cpp_headers/reduce.o 00:02:55.967 LINK lsvmd 00:02:55.967 LINK interrupt_tgt 00:02:55.967 LINK reactor_perf 00:02:55.967 LINK env_dpdk_post_init 00:02:55.967 CXX test/cpp_headers/rpc.o 00:02:55.967 CXX test/cpp_headers/scheduler.o 00:02:55.967 LINK vhost 00:02:55.967 LINK app_repeat 00:02:55.967 LINK boot_partition 00:02:55.967 CXX test/cpp_headers/scsi.o 00:02:55.967 CXX test/cpp_headers/scsi_spec.o 00:02:55.967 LINK zipf 00:02:55.967 LINK led 00:02:55.967 LINK connect_stress 00:02:55.967 LINK stub 00:02:55.967 CXX test/cpp_headers/sock.o 00:02:55.967 LINK spdk_tgt 00:02:55.967 LINK reserve 00:02:55.967 LINK err_injection 00:02:55.967 LINK doorbell_aers 00:02:55.967 LINK startup 00:02:55.967 CXX test/cpp_headers/stdinc.o 00:02:55.967 LINK pmr_persistence 00:02:55.967 LINK fused_ordering 00:02:55.967 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:02:55.967 struct spdk_nvme_fdp_ruhs ruhs; 00:02:55.968 ^ 00:02:55.968 LINK mkfs 00:02:55.968 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:55.968 LINK bdev_svc 00:02:55.968 LINK verify 00:02:55.968 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:55.968 LINK cmb_copy 00:02:55.968 LINK ioat_perf 00:02:55.968 LINK simple_copy 00:02:55.968 LINK hello_world 00:02:55.968 LINK hello_sock 00:02:55.968 LINK hotplug 00:02:55.968 LINK scheduler 00:02:55.968 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:55.968 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:55.968 LINK reset 00:02:55.968 LINK fdp 00:02:55.968 LINK nvme_dp 00:02:55.968 LINK sgl 00:02:55.968 CXX test/cpp_headers/string.o 00:02:55.968 LINK aer 00:02:55.968 LINK hello_bdev 00:02:55.968 CXX test/cpp_headers/thread.o 00:02:55.968 LINK overhead 00:02:55.968 LINK thread 00:02:55.968 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:55.968 LINK hello_blob 00:02:55.968 CXX test/cpp_headers/trace.o 00:02:55.968 CXX test/cpp_headers/trace_parser.o 00:02:55.968 CXX test/cpp_headers/tree.o 00:02:55.968 CXX test/cpp_headers/ublk.o 00:02:55.968 CXX test/cpp_headers/util.o 00:02:55.968 CXX test/cpp_headers/uuid.o 00:02:55.968 CXX test/cpp_headers/version.o 00:02:56.230 CXX test/cpp_headers/vfio_user_pci.o 00:02:56.230 LINK spdk_trace 00:02:56.230 CXX test/cpp_headers/vfio_user_spec.o 00:02:56.230 CXX test/cpp_headers/vhost.o 00:02:56.230 CXX test/cpp_headers/vmd.o 00:02:56.230 CXX test/cpp_headers/xor.o 00:02:56.230 CXX test/cpp_headers/zipf.o 00:02:56.230 LINK nvmf 00:02:56.230 LINK reconnect 00:02:56.230 LINK test_dma 00:02:56.230 LINK idxd_perf 00:02:56.230 LINK dif 00:02:56.230 LINK spdk_dd 00:02:56.230 LINK abort 
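The lone clang diagnostic above fires because fio_plugin.c embeds the variable-sized 'struct spdk_nvme_fdp_ruhs' in the middle of another struct, which C permits only as a GNU extension; the build treats it as a warning, not an error, which is why "1 warning generated" appears below and the run continues. If it ever had to be silenced rather than restructured, the warning group clang itself names in the message could be disabled, illustrated here as a sketch rather than anything this CI does:

    # Illustrative only: rebuild with the named warning group disabled.
    # Whether SPDK's make honors CFLAGS passed this way is an assumption.
    CFLAGS='-Wno-gnu-variable-sized-type-not-at-end' make -C spdk -j"$(nproc)"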
00:02:56.230 LINK arbitration 00:02:56.230 LINK bdevio 00:02:56.230 LINK nvme_compliance 00:02:56.230 LINK accel_perf 00:02:56.230 LINK pci_ut 00:02:56.230 LINK nvme_fuzz 00:02:56.230 LINK nvme_manage 00:02:56.491 LINK llvm_vfio_fuzz 00:02:56.491 LINK mem_callbacks 00:02:56.491 1 warning generated. 00:02:56.491 LINK spdk_bdev 00:02:56.491 LINK blobcli 00:02:56.491 LINK spdk_nvme_perf 00:02:56.491 LINK spdk_nvme 00:02:56.491 LINK bdevperf 00:02:56.491 LINK vhost_fuzz 00:02:56.491 LINK memory_ut 00:02:56.749 LINK spdk_nvme_identify 00:02:56.749 LINK cuse 00:02:56.749 LINK spdk_top 00:02:57.007 LINK llvm_nvme_fuzz 00:02:57.265 LINK iscsi_fuzz 00:02:57.265 LINK spdk_lock 00:02:59.409 LINK esnap 00:02:59.668 00:02:59.668 real 0m23.921s 00:02:59.668 user 4m36.190s 00:02:59.668 sys 1m55.723s 00:02:59.668 23:49:49 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:59.668 23:49:49 -- common/autotest_common.sh@10 -- $ set +x 00:02:59.668 ************************************ 00:02:59.668 END TEST make 00:02:59.668 ************************************ 00:02:59.928 23:49:49 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:59.928 23:49:49 -- nvmf/common.sh@7 -- # uname -s 00:02:59.929 23:49:49 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:59.929 23:49:49 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:59.929 23:49:49 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:59.929 23:49:49 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:59.929 23:49:49 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:59.929 23:49:49 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:59.929 23:49:49 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:59.929 23:49:49 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:59.929 23:49:49 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:59.929 23:49:49 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:59.929 23:49:49 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:02:59.929 23:49:49 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:02:59.929 23:49:49 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:59.929 23:49:49 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:59.929 23:49:49 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:59.929 23:49:49 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:59.929 23:49:49 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:59.929 23:49:49 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:59.929 23:49:49 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:59.929 23:49:49 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:59.929 23:49:49 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:59.929 23:49:49 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:59.929 23:49:49 -- paths/export.sh@5 -- # export PATH 00:02:59.929 23:49:49 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:59.929 23:49:49 -- nvmf/common.sh@46 -- # : 0 00:02:59.929 23:49:49 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:02:59.929 23:49:49 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:02:59.929 23:49:49 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:02:59.929 23:49:49 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:59.929 23:49:49 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:59.929 23:49:49 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:02:59.929 23:49:49 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:02:59.929 23:49:49 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:02:59.929 23:49:49 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:59.929 23:49:49 -- spdk/autotest.sh@32 -- # uname -s 00:02:59.929 23:49:49 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:59.929 23:49:49 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:59.929 23:49:49 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:59.929 23:49:49 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:59.929 23:49:49 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:59.929 23:49:49 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:59.929 23:49:49 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:59.929 23:49:49 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:59.929 23:49:49 -- spdk/autotest.sh@48 -- # udevadm_pid=401651 00:02:59.929 23:49:49 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:59.929 23:49:49 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:59.929 23:49:49 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:59.929 23:49:49 -- spdk/autotest.sh@54 -- # echo 401653 00:02:59.929 23:49:49 -- spdk/autotest.sh@56 -- # echo 401654 00:02:59.929 23:49:49 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:59.929 23:49:49 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:02:59.929 23:49:49 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:59.929 23:49:49 -- spdk/autotest.sh@60 -- # echo 401655 00:02:59.929 23:49:49 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:59.929 23:49:49 -- spdk/autotest.sh@62 -- # echo 401656 00:02:59.929 23:49:49 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:59.929 23:49:49 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:02:59.929 23:49:49 -- common/autotest_common.sh@712 -- # xtrace_disable 00:02:59.929 23:49:49 -- common/autotest_common.sh@10 -- # set +x 00:02:59.929 23:49:49 -- spdk/autotest.sh@70 -- # create_test_list 00:02:59.929 23:49:49 -- common/autotest_common.sh@736 -- # xtrace_disable 00:02:59.929 23:49:49 -- common/autotest_common.sh@10 -- # set +x 00:02:59.929 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:02:59.929 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:02:59.929 23:49:49 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:59.929 23:49:49 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:59.929 23:49:49 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:59.929 23:49:49 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:59.929 23:49:49 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:59.929 23:49:49 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:02:59.929 23:49:49 -- common/autotest_common.sh@1440 -- # uname 00:02:59.929 23:49:49 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:02:59.929 23:49:49 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:02:59.929 23:49:49 -- common/autotest_common.sh@1460 -- # uname 00:02:59.929 23:49:49 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:02:59.929 23:49:49 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:02:59.929 23:49:49 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:02:59.929 23:49:49 -- spdk/autotest.sh@83 -- # hash lcov 00:02:59.929 23:49:49 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:02:59.929 23:49:49 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:02:59.929 23:49:49 -- common/autotest_common.sh@712 -- # xtrace_disable 00:02:59.929 23:49:49 -- common/autotest_common.sh@10 -- # set +x 00:02:59.929 23:49:49 -- spdk/autotest.sh@102 -- # rm -f 00:02:59.929 23:49:49 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:03.220 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:03.220 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:03.220 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:03.220 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:03.220 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:03.479 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:03.479 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:03.479 
0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:03.479 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:03.479 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:03.479 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:03.479 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:03.479 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:03.479 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:03.479 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:03.737 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:03.737 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:03.737 23:49:53 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:03:03.737 23:49:53 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:03.737 23:49:53 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:03.737 23:49:53 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:03.737 23:49:53 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:03.737 23:49:53 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:03.737 23:49:53 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:03.737 23:49:53 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:03.737 23:49:53 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:03.737 23:49:53 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:03:03.737 23:49:53 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:03:03.737 23:49:53 -- spdk/autotest.sh@121 -- # grep -v p 00:03:03.737 23:49:53 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:03.737 23:49:53 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:03:03.737 23:49:53 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:03:03.737 23:49:53 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:03.737 23:49:53 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:03.737 No valid GPT data, bailing 00:03:03.737 23:49:53 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:03.737 23:49:53 -- scripts/common.sh@393 -- # pt= 00:03:03.737 23:49:53 -- scripts/common.sh@394 -- # return 1 00:03:03.737 23:49:53 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:03.737 1+0 records in 00:03:03.737 1+0 records out 00:03:03.737 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00216746 s, 484 MB/s 00:03:03.737 23:49:53 -- spdk/autotest.sh@129 -- # sync 00:03:03.737 23:49:53 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:03.737 23:49:53 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:03.737 23:49:53 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:10.303 23:49:59 -- spdk/autotest.sh@135 -- # uname -s 00:03:10.303 23:49:59 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:03:10.303 23:49:59 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:10.303 23:49:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:10.303 23:49:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:10.303 23:49:59 -- common/autotest_common.sh@10 -- # set +x 00:03:10.303 ************************************ 00:03:10.303 START TEST setup.sh 00:03:10.303 ************************************ 00:03:10.303 23:49:59 -- 
common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:10.303 * Looking for test storage... 00:03:10.303 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:10.303 23:49:59 -- setup/test-setup.sh@10 -- # uname -s 00:03:10.303 23:49:59 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:10.303 23:49:59 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:10.303 23:49:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:10.303 23:49:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:10.303 23:49:59 -- common/autotest_common.sh@10 -- # set +x 00:03:10.303 ************************************ 00:03:10.303 START TEST acl 00:03:10.303 ************************************ 00:03:10.303 23:49:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:10.303 * Looking for test storage... 00:03:10.303 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:10.303 23:49:59 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:10.303 23:49:59 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:10.303 23:49:59 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:10.303 23:49:59 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:10.303 23:49:59 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:10.303 23:49:59 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:10.303 23:49:59 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:10.303 23:49:59 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:10.303 23:49:59 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:10.303 23:49:59 -- setup/acl.sh@12 -- # devs=() 00:03:10.303 23:49:59 -- setup/acl.sh@12 -- # declare -a devs 00:03:10.303 23:49:59 -- setup/acl.sh@13 -- # drivers=() 00:03:10.303 23:49:59 -- setup/acl.sh@13 -- # declare -A drivers 00:03:10.303 23:49:59 -- setup/acl.sh@51 -- # setup reset 00:03:10.303 23:49:59 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:10.303 23:49:59 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:14.493 23:50:03 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:14.493 23:50:03 -- setup/acl.sh@16 -- # local dev driver 00:03:14.493 23:50:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:14.493 23:50:03 -- setup/acl.sh@15 -- # setup output status 00:03:14.493 23:50:03 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:14.493 23:50:03 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:17.782 Hugepages 00:03:17.782 node hugesize free / total 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 00:03:17.782 Type BDF Vendor Device 
NUMA Driver Device Block devices 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.782 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.782 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.782 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.783 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.783 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.783 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.783 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.783 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.783 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.783 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.783 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.783 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.783 23:50:06 -- 
setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.783 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.783 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.783 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.783 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.783 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.783 23:50:06 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:17.783 23:50:06 -- setup/acl.sh@20 -- # continue 00:03:17.783 23:50:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.783 23:50:07 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:17.783 23:50:07 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:17.783 23:50:07 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:17.783 23:50:07 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:17.783 23:50:07 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:17.783 23:50:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:17.783 23:50:07 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:17.783 23:50:07 -- setup/acl.sh@54 -- # run_test denied denied 00:03:17.783 23:50:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:17.783 23:50:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:17.783 23:50:07 -- common/autotest_common.sh@10 -- # set +x 00:03:17.783 ************************************ 00:03:17.783 START TEST denied 00:03:17.783 ************************************ 00:03:17.783 23:50:07 -- common/autotest_common.sh@1104 -- # denied 00:03:17.783 23:50:07 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:17.783 23:50:07 -- setup/acl.sh@38 -- # setup output config 00:03:17.783 23:50:07 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:17.783 23:50:07 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:17.783 23:50:07 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:21.069 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:21.069 23:50:10 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:21.069 23:50:10 -- setup/acl.sh@28 -- # local dev driver 00:03:21.069 23:50:10 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:21.069 23:50:10 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:21.069 23:50:10 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:21.069 23:50:10 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:21.069 23:50:10 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:21.069 23:50:10 -- setup/acl.sh@41 -- # setup reset 00:03:21.069 23:50:10 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:21.069 23:50:10 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:25.255 00:03:25.255 real 0m7.815s 00:03:25.255 user 0m2.384s 00:03:25.255 sys 0m4.718s 00:03:25.255 23:50:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:25.255 23:50:14 -- 
common/autotest_common.sh@10 -- # set +x 00:03:25.255 ************************************ 00:03:25.255 END TEST denied 00:03:25.255 ************************************ 00:03:25.513 23:50:14 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:25.513 23:50:14 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:25.513 23:50:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:25.513 23:50:14 -- common/autotest_common.sh@10 -- # set +x 00:03:25.513 ************************************ 00:03:25.513 START TEST allowed 00:03:25.513 ************************************ 00:03:25.513 23:50:14 -- common/autotest_common.sh@1104 -- # allowed 00:03:25.513 23:50:14 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:25.513 23:50:14 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:25.513 23:50:14 -- setup/acl.sh@45 -- # setup output config 00:03:25.513 23:50:14 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:25.513 23:50:14 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:30.786 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:30.787 23:50:19 -- setup/acl.sh@47 -- # verify 00:03:30.787 23:50:19 -- setup/acl.sh@28 -- # local dev driver 00:03:30.787 23:50:19 -- setup/acl.sh@48 -- # setup reset 00:03:30.787 23:50:19 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:30.787 23:50:19 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:34.080 00:03:34.080 real 0m8.460s 00:03:34.080 user 0m2.432s 00:03:34.080 sys 0m4.695s 00:03:34.080 23:50:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:34.080 23:50:23 -- common/autotest_common.sh@10 -- # set +x 00:03:34.080 ************************************ 00:03:34.080 END TEST allowed 00:03:34.080 ************************************ 00:03:34.080 00:03:34.080 real 0m23.636s 00:03:34.080 user 0m7.480s 00:03:34.080 sys 0m14.396s 00:03:34.080 23:50:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:34.080 23:50:23 -- common/autotest_common.sh@10 -- # set +x 00:03:34.080 ************************************ 00:03:34.080 END TEST acl 00:03:34.080 ************************************ 00:03:34.080 23:50:23 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:34.080 23:50:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:34.080 23:50:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:34.080 23:50:23 -- common/autotest_common.sh@10 -- # set +x 00:03:34.080 ************************************ 00:03:34.080 START TEST hugepages 00:03:34.080 ************************************ 00:03:34.080 23:50:23 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:34.080 * Looking for test storage... 
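The long run of 'setup/common.sh@32 continue' trace below is the bash xtrace of a get_meminfo helper walking every /proc/meminfo field until it reaches Hugepagesize. A condensed sketch of what that helper computes; this is simplified to a direct read loop, whereas the traced script first snapshots meminfo into an array with mapfile:

    # Condensed, simplified equivalent of the get_meminfo call traced
    # below: return one field (here Hugepagesize, in kB) from meminfo.
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    get_meminfo Hugepagesize    # prints 2048 on this node, per the log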
00:03:34.080 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:34.080 23:50:23 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:34.080 23:50:23 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:34.080 23:50:23 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:34.080 23:50:23 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:34.080 23:50:23 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:34.080 23:50:23 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:34.080 23:50:23 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:34.080 23:50:23 -- setup/common.sh@18 -- # local node= 00:03:34.080 23:50:23 -- setup/common.sh@19 -- # local var val 00:03:34.080 23:50:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:34.080 23:50:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:34.080 23:50:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:34.080 23:50:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:34.080 23:50:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:34.080 23:50:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 39402516 kB' 'MemAvailable: 43040120 kB' 'Buffers: 10504 kB' 'Cached: 12552824 kB' 'SwapCached: 0 kB' 'Active: 9661788 kB' 'Inactive: 3422776 kB' 'Active(anon): 9067680 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525232 kB' 'Mapped: 155172 kB' 'Shmem: 8546444 kB' 'KReclaimable: 235284 kB' 'Slab: 701100 kB' 'SReclaimable: 235284 kB' 'SUnreclaim: 465816 kB' 'KernelStack: 21632 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439056 kB' 'Committed_AS: 10336124 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213096 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e 
]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- 
# [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # continue 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:34.080 23:50:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:34.080 23:50:23 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:34.080 23:50:23 -- setup/common.sh@33 -- # echo 2048 00:03:34.080 23:50:23 -- setup/common.sh@33 -- # return 0 00:03:34.080 23:50:23 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:34.080 23:50:23 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:34.080 23:50:23 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:34.080 23:50:23 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:34.080 23:50:23 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:34.080 23:50:23 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 
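For reference, the scan traced above is setup/common.sh's get_meminfo helper resolving Hugepagesize: it splits each /proc/meminfo row on ': ', skips every key that is not the one requested (hence one 'continue' per row in the xtrace), and echoes the value of the first match -- 2048 kB on this host. A minimal sketch of that pattern, reconstructed from the xtrace rather than copied verbatim from the SPDK script (the real helper also mapfiles the input and strips the 'Node N ' prefix so the same loop works on per-node meminfo):

    get_meminfo() {
        # get_meminfo <key> [node] -- print the value column for <key>.
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # An optional node argument switches to that node's view of the counters.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && echo "$val" && return 0
        done <"$mem_f"
        return 1
    }

    get_meminfo Hugepagesize # -> 2048 on this host

Matching on the raw key keeps the helper key-agnostic, which is why the same scan loop reappears below for AnonHugePages, HugePages_Surp and HugePages_Rsvd.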
00:03:34.080 23:50:23 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:34.080 23:50:23 -- setup/hugepages.sh@207 -- # get_nodes 00:03:34.080 23:50:23 -- setup/hugepages.sh@27 -- # local node 00:03:34.080 23:50:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:34.080 23:50:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:34.080 23:50:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:34.080 23:50:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:34.080 23:50:23 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:34.080 23:50:23 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:34.080 23:50:23 -- setup/hugepages.sh@208 -- # clear_hp 00:03:34.080 23:50:23 -- setup/hugepages.sh@37 -- # local node hp 00:03:34.080 23:50:23 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:34.080 23:50:23 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:34.080 23:50:23 -- setup/hugepages.sh@41 -- # echo 0 00:03:34.080 23:50:23 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:34.080 23:50:23 -- setup/hugepages.sh@41 -- # echo 0 00:03:34.080 23:50:23 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:34.080 23:50:23 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:34.080 23:50:23 -- setup/hugepages.sh@41 -- # echo 0 00:03:34.080 23:50:23 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:34.080 23:50:23 -- setup/hugepages.sh@41 -- # echo 0 00:03:34.080 23:50:23 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:34.080 23:50:23 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:34.080 23:50:23 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:34.080 23:50:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:34.080 23:50:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:34.081 23:50:23 -- common/autotest_common.sh@10 -- # set +x 00:03:34.081 ************************************ 00:03:34.081 START TEST default_setup 00:03:34.081 ************************************ 00:03:34.081 23:50:23 -- common/autotest_common.sh@1104 -- # default_setup 00:03:34.081 23:50:23 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:34.081 23:50:23 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:34.081 23:50:23 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:34.081 23:50:23 -- setup/hugepages.sh@51 -- # shift 00:03:34.081 23:50:23 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:34.081 23:50:23 -- setup/hugepages.sh@52 -- # local node_ids 00:03:34.081 23:50:23 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:34.081 23:50:23 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:34.081 23:50:23 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:34.081 23:50:23 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:34.081 23:50:23 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:34.081 23:50:23 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:34.081 23:50:23 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:34.081 23:50:23 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:34.081 23:50:23 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:34.081 23:50:23 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:34.081 23:50:23 -- 
setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:34.081 23:50:23 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:34.081 23:50:23 -- setup/hugepages.sh@73 -- # return 0 00:03:34.081 23:50:23 -- setup/hugepages.sh@137 -- # setup output 00:03:34.081 23:50:23 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:34.081 23:50:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:37.368 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:37.368 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:37.368 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:37.627 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:39.541 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:39.541 23:50:28 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:39.541 23:50:28 -- setup/hugepages.sh@89 -- # local node 00:03:39.541 23:50:28 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:39.541 23:50:28 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:39.541 23:50:28 -- setup/hugepages.sh@92 -- # local surp 00:03:39.541 23:50:28 -- setup/hugepages.sh@93 -- # local resv 00:03:39.541 23:50:28 -- setup/hugepages.sh@94 -- # local anon 00:03:39.541 23:50:28 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:39.541 23:50:28 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:39.541 23:50:28 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:39.541 23:50:28 -- setup/common.sh@18 -- # local node= 00:03:39.541 23:50:28 -- setup/common.sh@19 -- # local var val 00:03:39.541 23:50:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.541 23:50:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.541 23:50:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.541 23:50:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.541 23:50:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.541 23:50:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.541 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.541 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.541 23:50:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41633008 kB' 'MemAvailable: 45270056 kB' 'Buffers: 10504 kB' 'Cached: 12552960 kB' 'SwapCached: 0 kB' 'Active: 9680280 kB' 'Inactive: 3422776 kB' 'Active(anon): 9086172 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 542988 kB' 'Mapped: 155476 kB' 'Shmem: 8546580 kB' 'KReclaimable: 234172 kB' 'Slab: 698568 kB' 'SReclaimable: 234172 kB' 'SUnreclaim: 464396 kB' 'KernelStack: 21664 kB' 'PageTables: 8012 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10357164 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213080 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:39.541 23:50:28 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.541 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.541 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.541 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.541 23:50:28 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.541 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 
-- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ KReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.542 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.542 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read 
-r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.543 23:50:28 -- setup/common.sh@33 -- # echo 0 00:03:39.543 23:50:28 -- setup/common.sh@33 -- # return 0 00:03:39.543 23:50:28 -- setup/hugepages.sh@97 -- # anon=0 00:03:39.543 23:50:28 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:39.543 23:50:28 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.543 23:50:28 -- setup/common.sh@18 -- # local node= 00:03:39.543 23:50:28 -- setup/common.sh@19 -- # local var val 00:03:39.543 23:50:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.543 23:50:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.543 23:50:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.543 23:50:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.543 23:50:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.543 23:50:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.543 23:50:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41640736 kB' 'MemAvailable: 45277784 kB' 'Buffers: 10504 kB' 'Cached: 12552964 kB' 'SwapCached: 0 kB' 'Active: 9679516 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085408 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 542200 kB' 'Mapped: 155336 kB' 'Shmem: 8546584 kB' 'KReclaimable: 234172 kB' 'Slab: 698480 kB' 'SReclaimable: 234172 kB' 'SUnreclaim: 464308 kB' 'KernelStack: 21680 kB' 'PageTables: 7692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10356928 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213032 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 
-- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.543 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.543 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 
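Earlier in the trace, before run_test started default_setup, the clear_hp pass zeroed every per-node, per-size hugepage pool -- the repeated 'echo 0' lines aimed at /sys/devices/system/node/node*/hugepages/hugepages-*/nr_hugepages. A compact sketch of that pass, reconstructed from the xtrace (it assumes the standard sysfs layout and root privileges; the real script iterates the nodes_sys indices instead of globbing the node directories):

    clear_hp() {
        local node hp
        for node in /sys/devices/system/node/node[0-9]*; do
            for hp in "$node"/hugepages/hugepages-*; do
                # Release this node's pool for this page size (needs root).
                echo 0 >"$hp/nr_hugepages"
            done
        done
        export CLEAR_HUGE=yes
    }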
00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.544 23:50:28 
-- setup/common.sh@31 -- # read -r var val _ 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.544 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.544 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
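The pool size itself came from get_test_nr_hugepages 2097152 0: with the 2048 kB default page size found above, a 2097152 kB request works out to 2097152 / 2048 = 1024 pages, all pinned to the single user-supplied node -- which is why the trace sets nodes_test[0]=1024. A sketch of that arithmetic (a reconstruction from the xtrace; when no node list is given, the real helper spreads the count across all nodes instead):

    default_hugepages=2048 # kB, from the Hugepagesize scan
    nodes_test=()

    get_test_nr_hugepages() {
        local size=$1 # requested total, in kB
        shift
        local node_ids=("$@") # explicit NUMA nodes, e.g. '0'
        nr_hugepages=$((size / default_hugepages)) # 2097152 / 2048 = 1024
        local node
        for node in "${node_ids[@]}"; do
            nodes_test[node]=$nr_hugepages
        done
    }

    get_test_nr_hugepages 2097152 0 # -> nr_hugepages=1024, nodes_test[0]=1024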
00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.545 23:50:28 -- setup/common.sh@33 -- # echo 0 00:03:39.545 23:50:28 -- setup/common.sh@33 -- # return 0 00:03:39.545 23:50:28 -- setup/hugepages.sh@99 -- # surp=0 00:03:39.545 23:50:28 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:39.545 23:50:28 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:39.545 23:50:28 -- setup/common.sh@18 -- # local node= 00:03:39.545 23:50:28 -- setup/common.sh@19 -- # local var val 00:03:39.545 23:50:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.545 23:50:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.545 23:50:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.545 23:50:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.545 23:50:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.545 23:50:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.545 23:50:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41640096 kB' 'MemAvailable: 45277144 kB' 'Buffers: 10504 kB' 'Cached: 12552976 kB' 'SwapCached: 0 kB' 'Active: 9680008 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085900 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 542560 kB' 'Mapped: 155336 kB' 'Shmem: 8546596 kB' 'KReclaimable: 234172 kB' 'Slab: 698044 kB' 'SReclaimable: 234172 kB' 'SUnreclaim: 463872 kB' 'KernelStack: 21792 kB' 'PageTables: 7880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10357192 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213096 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.545 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.545 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 
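Once the pool is in place, verify_nr_hugepages re-reads the counters: anon THP pages are counted only when transparent hugepages are not disabled (the earlier '[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]' check; AnonHugePages is 0 kB here regardless), and the test then asserts that HugePages_Total equals the request plus surplus and reserved pages -- the '(( 1024 == nr_hugepages + surp + resv ))' check the trace reaches just below. A sketch of that accounting, reconstructed from the xtrace, assuming the kernel's standard transparent_hugepage sysfs path and reusing the get_meminfo and nr_hugepages definitions sketched above:

    verify_nr_hugepages() {
        local anon=0 surp resv total
        # Count anon THP only when transparent hugepages are not set to [never].
        if [[ $(</sys/kernel/mm/transparent_hugepage/enabled) != *"[never]"* ]]; then
            anon=$(get_meminfo AnonHugePages)
        fi
        surp=$(get_meminfo HugePages_Surp)
        resv=$(get_meminfo HugePages_Rsvd)
        total=$(get_meminfo HugePages_Total)
        echo "nr_hugepages=$nr_hugepages"
        echo "resv_hugepages=$resv"
        echo "surplus_hugepages=$surp"
        echo "anon_hugepages=$anon"
        ((total == nr_hugepages + surp + resv)) # 1024 == 1024 + 0 + 0 here
    }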
00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.546 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.546 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- 
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.547 23:50:28 -- setup/common.sh@33 -- # echo 0 00:03:39.547 23:50:28 -- setup/common.sh@33 -- # return 0 00:03:39.547 23:50:28 -- setup/hugepages.sh@100 -- # resv=0 00:03:39.547 23:50:28 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:39.547 nr_hugepages=1024 00:03:39.547 23:50:28 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:39.547 resv_hugepages=0 00:03:39.547 23:50:28 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:39.547 surplus_hugepages=0 00:03:39.547 23:50:28 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:39.547 anon_hugepages=0 00:03:39.547 23:50:28 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:39.547 23:50:28 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:39.547 23:50:28 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:39.547 23:50:28 -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:03:39.547 23:50:28 -- setup/common.sh@18 -- # local node= 00:03:39.547 23:50:28 -- setup/common.sh@19 -- # local var val 00:03:39.547 23:50:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.547 23:50:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.547 23:50:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.547 23:50:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.547 23:50:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.547 23:50:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41640756 kB' 'MemAvailable: 45277804 kB' 'Buffers: 10504 kB' 'Cached: 12552988 kB' 'SwapCached: 0 kB' 'Active: 9679376 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085268 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541896 kB' 'Mapped: 155336 kB' 'Shmem: 8546608 kB' 'KReclaimable: 234172 kB' 'Slab: 698044 kB' 'SReclaimable: 234172 kB' 'SUnreclaim: 463872 kB' 'KernelStack: 21776 kB' 'PageTables: 8072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10357208 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213064 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.547 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.547 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.548 23:50:28 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.548 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.548 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.548 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.548 23:50:28 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.548 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.548 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.548 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.548 23:50:28 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.548 
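The xtrace run above is setup/common.sh's get_meminfo helper: it snapshots the meminfo file once (mapfile + printf), then re-reads it line by line with IFS=': ' and skips every field with continue until the requested one matches, at which point it echoes the value and returns. A minimal self-contained sketch of that pattern follows; the function name get_meminfo_sketch and its argument handling are illustrative assumptions, not the exact SPDK helper.

```bash
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern traced above (assumed simplification,
# not the real setup/common.sh implementation).
shopt -s extglob   # needed for the "Node N " prefix strip below

get_meminfo_sketch() {
    local get=$1 node=${2:-}        # field name, optional NUMA node
    local mem_f=/proc/meminfo
    # With a node argument the per-node file is read instead, as the trace
    # shows via mem_f=/sys/devices/system/node/node0/meminfo.
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")  # per-node lines carry a "Node N " prefix
    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
        continue                      # the repeated skip records in the log
    done
    return 1
}

get_meminfo_sketch HugePages_Total    # -> 1024 on the node traced above
get_meminfo_sketch HugePages_Surp 0   # -> 0 for NUMA node 0
```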
23:50:28 -- setup/common.sh@32 -- # continue
[setup/common.sh@32 field scan: Active through Unaccepted are each read and skipped with continue until the HugePages_Total line is reached]
00:03:39.549 23:50:28 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.549 23:50:28 -- setup/common.sh@33 -- # echo 1024 00:03:39.549 23:50:28 -- setup/common.sh@33 -- # return 0 00:03:39.549 23:50:28 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:39.549 23:50:28 -- setup/hugepages.sh@112 -- # get_nodes 00:03:39.549 23:50:28 -- setup/hugepages.sh@27 -- # local node 00:03:39.549 23:50:28 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:39.549 23:50:28 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:39.549 23:50:28 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:39.549 23:50:28 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:39.549 23:50:28 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:39.549 23:50:28 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:39.550 23:50:28 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:39.550 23:50:28 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:39.550 23:50:28 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:39.550 23:50:28 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.550 23:50:28 -- setup/common.sh@18 -- # local node=0 00:03:39.550 23:50:28 -- setup/common.sh@19 -- # local var val 00:03:39.550 23:50:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.550 23:50:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.550 23:50:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:39.550 23:50:28 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:39.550 23:50:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.550 23:50:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 24848016 kB' 'MemUsed: 7744068 kB' 'SwapCached: 0 kB' 'Active: 
3586376 kB' 'Inactive: 110720 kB' 'Active(anon): 3321636 kB' 'Inactive(anon): 0 kB' 'Active(file): 264740 kB' 'Inactive(file): 110720 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3460512 kB' 'Mapped: 104532 kB' 'AnonPages: 239748 kB' 'Shmem: 3085052 kB' 'KernelStack: 11304 kB' 'PageTables: 4096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111592 kB' 'Slab: 338076 kB' 'SReclaimable: 111592 kB' 'SUnreclaim: 226484 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.550 23:50:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.550 23:50:28 -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.550 23:50:28 -- setup/common.sh@32 -- # continue
[setup/common.sh@32 field scan: Mlocked through HugePages_Free in node0's meminfo are each read and skipped with continue until the HugePages_Surp line is reached]
00:03:39.551 23:50:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.551 23:50:28 -- setup/common.sh@33 -- # echo 0 00:03:39.551 23:50:28 -- setup/common.sh@33 -- # return 0 00:03:39.551 23:50:28 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:39.551 23:50:28 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:39.551 23:50:28 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:39.551 23:50:28 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:39.551 23:50:28 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:39.551 node0=1024 expecting 1024 00:03:39.551 23:50:28 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:39.551 00:03:39.551 real 0m5.348s 00:03:39.551 user 0m1.392s 00:03:39.551 sys 0m2.452s 00:03:39.551 23:50:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:39.551 23:50:28 -- common/autotest_common.sh@10 -- # set +x 00:03:39.551 ************************************ 00:03:39.551 END TEST default_setup 00:03:39.551 ************************************ 00:03:39.551 23:50:28 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:39.551 23:50:28 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:39.551 23:50:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:39.551 23:50:28 -- common/autotest_common.sh@10 -- # set +x 00:03:39.551 ************************************ 00:03:39.551 START TEST per_node_1G_alloc 00:03:39.551 ************************************ 00:03:39.551 23:50:28 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc 00:03:39.551 23:50:28 -- setup/hugepages.sh@143 -- # local IFS=, 00:03:39.551 23:50:28 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:39.551 23:50:28 -- setup/hugepages.sh@49 -- # local size=1048576 00:03:39.551 23:50:28 -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:39.551 23:50:28 -- setup/hugepages.sh@51 -- # shift 00:03:39.551 23:50:28 -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:39.551 23:50:28 -- setup/hugepages.sh@52 -- # local node_ids 00:03:39.551 23:50:28 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:39.551 23:50:28 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:39.551 23:50:28 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:39.551 23:50:28 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:39.551 23:50:28 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:39.551 23:50:28 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:39.551 23:50:28 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:39.551 23:50:28 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:39.551 23:50:28 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:39.551 23:50:28 -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:39.551 23:50:28 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:39.551 23:50:28 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:39.551 23:50:28 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:39.551 23:50:28 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:39.551 23:50:28 -- setup/hugepages.sh@73 -- # return 0 00:03:39.551 23:50:28 -- setup/hugepages.sh@146 -- # NRHUGE=512 
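Numerically, the default_setup verification above reduces to: get_meminfo reported HugePages_Total=1024 with resv=0 and surp=0, so 1024 == nr_hugepages + surp + resv holds and node0 ends up with all 1024 pages ('node0=1024 expecting 1024'). The per_node_1G_alloc test that starts next asks get_test_nr_hugepages for 1048576 kB spread over nodes 0 and 1; with the 2048 kB Hugepagesize shown in the meminfo dumps that is 512 pages per listed node (NRHUGE=512). A hedged sketch of that arithmetic (variable names are illustrative, not the actual setup/hugepages.sh functions):

```bash
#!/usr/bin/env bash
# Sketch of the hugepage accounting traced above; plain shell arithmetic,
# not the real setup/hugepages.sh helpers.

# default_setup check: kernel total must equal target + surplus + reserved.
total=1024 nr_hugepages=1024 surp=0 resv=0
(( total == nr_hugepages + surp + resv )) &&
    echo "node0=$total expecting $nr_hugepages"   # matches the log line

# per_node_1G_alloc sizing: 1 GiB in kB over 2048 kB pages = 512 per node.
size_kb=1048576          # get_test_nr_hugepages 1048576 0 1
hugepagesize_kb=2048     # 'Hugepagesize: 2048 kB' in the dumps above
for node in 0 1; do      # HUGENODE=0,1
    echo "node$node: $(( size_kb / hugepagesize_kb )) pages"   # -> 512
done
```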
23:50:28 -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:39.551 23:50:28 -- setup/hugepages.sh@146 -- # setup output 00:03:39.551 23:50:28 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:39.551 23:50:28 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:42.851 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:42.851 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:42.851 23:50:32 -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:42.851 23:50:32 -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:42.851 23:50:32 -- setup/hugepages.sh@89 -- # local node 00:03:42.851 23:50:32 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:42.851 23:50:32 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:42.851 23:50:32 -- setup/hugepages.sh@92 -- # local surp 00:03:42.851 23:50:32 -- setup/hugepages.sh@93 -- # local resv 00:03:42.851 23:50:32 -- setup/hugepages.sh@94 -- # local anon 00:03:42.851 23:50:32 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:42.851 23:50:32 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:42.851 23:50:32 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:42.851 23:50:32 -- setup/common.sh@18 -- # local node= 00:03:42.851 23:50:32 -- setup/common.sh@19 -- # local var val 00:03:42.851 23:50:32 -- setup/common.sh@20 -- # local mem_f mem 00:03:42.851 23:50:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.851 23:50:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.851 23:50:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.851 23:50:32 -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.851 23:50:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.851 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.851 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.852 23:50:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41683700 kB' 'MemAvailable: 45320712 kB' 'Buffers: 10504 kB' 'Cached: 12553076 kB' 'SwapCached: 0 kB' 'Active: 9679868 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085760 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541820 kB' 'Mapped: 155444 
kB' 'Shmem: 8546696 kB' 'KReclaimable: 234100 kB' 'Slab: 697896 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463796 kB' 'KernelStack: 21648 kB' 'PageTables: 7732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10353800 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213288 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:42.852
[setup/common.sh@32 field scan: MemTotal through HardwareCorrupted are each read and skipped with continue until the AnonHugePages line is reached]
00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.853 23:50:32 -- setup/common.sh@33 -- # echo 0 00:03:42.853 23:50:32 -- setup/common.sh@33 -- # return 0 00:03:42.853 23:50:32 -- setup/hugepages.sh@97 -- # anon=0 00:03:42.853 23:50:32 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:42.853 23:50:32 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:42.853 23:50:32 -- setup/common.sh@18 -- # local node= 00:03:42.853 23:50:32 -- setup/common.sh@19 -- # local var val 00:03:42.853 23:50:32 -- setup/common.sh@20 -- # local mem_f mem 00:03:42.853 23:50:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.853 23:50:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.853 23:50:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.853 23:50:32 -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.853 23:50:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41684484 kB' 'MemAvailable: 45321496 kB' 'Buffers: 10504 kB' 'Cached: 12553076 kB' 'SwapCached: 0 kB' 'Active: 9679596 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085488 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541620 kB' 'Mapped: 155416 kB' 'Shmem: 8546696 kB' 'KReclaimable: 234100 kB' 'Slab: 697928 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463828 kB' 'KernelStack: 21632 kB' 'PageTables: 7672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10353812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213256 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 
1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.853 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.853 
00:03:42.853 23:50:32 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:42.853 23:50:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:42.853 23:50:32 -- setup/common.sh@18 -- # local node=
00:03:42.853 23:50:32 -- setup/common.sh@19 -- # local var val
00:03:42.853 23:50:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:42.853 23:50:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:42.853 23:50:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:42.853 23:50:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:42.853 23:50:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:42.853 23:50:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:42.853 23:50:32 -- setup/common.sh@31 -- # IFS=': '
00:03:42.853 23:50:32 -- setup/common.sh@31 -- # read -r var val _
00:03:42.853 23:50:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41684484 kB' 'MemAvailable: 45321496 kB' 'Buffers: 10504 kB' 'Cached: 12553076 kB' 'SwapCached: 0 kB' 'Active: 9679596 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085488 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541620 kB' 'Mapped: 155416 kB' 'Shmem: 8546696 kB' 'KReclaimable: 234100 kB' 'Slab: 697928 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463828 kB' 'KernelStack: 21632 kB' 'PageTables: 7672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10353812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213256 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:03:42.853 [xtrace condensed: setup/common.sh@31-32 walks every field of the snapshot above, from MemTotal through HugePages_Rsvd, and continues past each one that is not HugePages_Surp]
00:03:42.854 23:50:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:42.854 23:50:32 -- setup/common.sh@33 -- # echo 0
00:03:42.854 23:50:32 -- setup/common.sh@33 -- # return 0
00:03:42.854 23:50:32 -- setup/hugepages.sh@99 -- # surp=0
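For reference, the lookup that the scan above performs field by field can also be expressed in one line; this is a hypothetical equivalent for illustration, not what setup/common.sh actually runs:

  # hypothetical one-liner equivalent of: get_meminfo HugePages_Surp
  awk -v key=HugePages_Surp '$1 == (key ":") { print $2 }' /proc/meminfo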
00:03:42.854 23:50:32 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:42.854 23:50:32 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:42.854 23:50:32 -- setup/common.sh@18 -- # local node=
00:03:42.854 23:50:32 -- setup/common.sh@19 -- # local var val
00:03:42.854 23:50:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:42.854 23:50:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:42.854 23:50:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:42.854 23:50:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:42.854 23:50:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:42.854 23:50:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:42.854 23:50:32 -- setup/common.sh@31 -- # IFS=': '
00:03:42.854 23:50:32 -- setup/common.sh@31 -- # read -r var val _
00:03:42.854 23:50:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41685216 kB' 'MemAvailable: 45322228 kB' 'Buffers: 10504 kB' 'Cached: 12553088 kB' 'SwapCached: 0 kB' 'Active: 9679116 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085008 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541584 kB' 'Mapped: 155340 kB' 'Shmem: 8546708 kB' 'KReclaimable: 234100 kB' 'Slab: 697932 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463832 kB' 'KernelStack: 21632 kB' 'PageTables: 7672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10353824 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213256 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:03:42.854 [xtrace condensed: the same field-by-field scan as above, this time against HugePages_Rsvd; every earlier field continues]
00:03:42.855 23:50:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:42.855 23:50:32 -- setup/common.sh@33 -- # echo 0
00:03:42.855 23:50:32 -- setup/common.sh@33 -- # return 0
00:03:42.855 23:50:32 -- setup/hugepages.sh@100 -- # resv=0
00:03:42.855 23:50:32 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:42.855 nr_hugepages=1024
00:03:42.855 23:50:32 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:42.855 resv_hugepages=0
00:03:42.855 23:50:32 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:42.855 surplus_hugepages=0
00:03:42.855 23:50:32 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:42.855 anon_hugepages=0
00:03:42.856 23:50:32 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:42.856 23:50:32 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
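The assertions just traced (setup/hugepages.sh@102-110) implement the hugepage accounting check, including the HugePages_Total re-query that follows immediately below. A sketch under the variable names visible in the trace; expected (1024 in this run) standing for the requested page count, and everything around these lines, is assumed:

  # Accounting check as traced at setup/hugepages.sh@102-110 (a sketch;
  # only the variable names and comparisons are confirmed by the log).
  echo "nr_hugepages=$nr_hugepages"
  echo "resv_hugepages=$resv"
  echo "surplus_hugepages=$surp"
  echo "anon_hugepages=$anon"
  (( expected == nr_hugepages + surp + resv ))   # @107
  (( expected == nr_hugepages ))                 # @109
  # the expected total must match what the kernel actually accounts for
  total=$(get_meminfo HugePages_Total)           # @110
  (( total == nr_hugepages + surp + resv ))      # @110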
00:03:42.856 23:50:32 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:42.856 23:50:32 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:42.856 23:50:32 -- setup/common.sh@18 -- # local node=
00:03:42.856 23:50:32 -- setup/common.sh@19 -- # local var val
00:03:42.856 23:50:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:42.856 23:50:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:42.856 23:50:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:42.856 23:50:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:42.856 23:50:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:42.856 23:50:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:42.856 23:50:32 -- setup/common.sh@31 -- # IFS=': '
00:03:42.856 23:50:32 -- setup/common.sh@31 -- # read -r var val _
00:03:42.856 23:50:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41690808 kB' 'MemAvailable: 45327820 kB' 'Buffers: 10504 kB' 'Cached: 12553104 kB' 'SwapCached: 0 kB' 'Active: 9679356 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085248 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541880 kB' 'Mapped: 155340 kB' 'Shmem: 8546724 kB' 'KReclaimable: 234100 kB' 'Slab: 697916 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463816 kB' 'KernelStack: 21664 kB' 'PageTables: 7788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10353840 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213240 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:03:42.856 [xtrace condensed: field-by-field scan against HugePages_Total; every earlier field continues]
00:03:42.857 23:50:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:42.857 23:50:32 -- setup/common.sh@33 -- # echo 1024
00:03:42.857 23:50:32 -- setup/common.sh@33 -- # return 0
00:03:42.857 23:50:32 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:42.857 23:50:32 -- setup/hugepages.sh@112 -- # get_nodes
00:03:42.857 23:50:32 -- setup/hugepages.sh@27 -- # local node
00:03:42.857 23:50:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:42.857 23:50:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:42.857 23:50:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:42.857 23:50:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:42.857 23:50:32 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:42.857 23:50:32 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
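get_nodes (setup/hugepages.sh@27-33) has just discovered both NUMA nodes, and the per-node loop that starts next calls get_meminfo with a node index, which takes the /sys/devices/system/node/nodeN/meminfo branch shown earlier. A sketch of get_nodes; what populates each nodes_sys slot is not visible in this log (it expands to 512 here), so the command substitution on the right-hand side is an assumption:

  # get_nodes as traced at setup/hugepages.sh@27-33 (a sketch; the loop
  # shape and no_nodes check are confirmed, the per-node value source is
  # assumed to be a get_meminfo lookup).
  get_nodes() {
      local node
      for node in /sys/devices/system/node/node+([0-9]); do
          # ${node##*node} strips the path down to the bare node index
          nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
      done
      no_nodes=${#nodes_sys[@]}
      (( no_nodes > 0 ))  # at least one NUMA node must be present
  }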
00:03:42.857 23:50:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:42.857 23:50:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:42.857 23:50:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:42.857 23:50:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:42.857 23:50:32 -- setup/common.sh@18 -- # local node=0
00:03:42.857 23:50:32 -- setup/common.sh@19 -- # local var val
00:03:42.857 23:50:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:42.857 23:50:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:42.857 23:50:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:42.857 23:50:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:42.857 23:50:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:42.857 23:50:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:42.857 23:50:32 -- setup/common.sh@31 -- # IFS=': '
00:03:42.857 23:50:32 -- setup/common.sh@31 -- # read -r var val _
00:03:42.857 23:50:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25931232 kB' 'MemUsed: 6660852 kB' 'SwapCached: 0 kB' 'Active: 3586060 kB' 'Inactive: 110720 kB' 'Active(anon): 3321320 kB' 'Inactive(anon): 0 kB' 'Active(file): 264740 kB' 'Inactive(file): 110720 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3460604 kB' 'Mapped: 104536 kB' 'AnonPages: 239404 kB' 'Shmem: 3085144 kB' 'KernelStack: 11304 kB' 'PageTables: 4032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111584 kB' 'Slab: 337920 kB' 'SReclaimable: 111584 kB' 'SUnreclaim: 226336 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:42.858 [xtrace condensed: node0 fields scanned one by one against HugePages_Surp; every non-matching field continues]
00:03:42.858 23:50:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:42.858 23:50:32 -- setup/common.sh@33 -- # echo 0
00:03:42.858 23:50:32 -- setup/common.sh@33 -- # return 0
00:03:42.858 23:50:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:42.858 23:50:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:42.858 23:50:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:42.858 23:50:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:42.858 23:50:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:42.858 23:50:32 -- setup/common.sh@18 -- # local node=1
00:03:42.858 23:50:32 -- setup/common.sh@19 -- # local var val
00:03:42.858 23:50:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:42.858 23:50:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:42.858 23:50:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:42.858 23:50:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:42.858 23:50:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:42.858 23:50:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:42.858 23:50:32 -- setup/common.sh@31 -- # IFS=': '
00:03:42.858 23:50:32 -- setup/common.sh@31 -- # read -r var val _
00:03:42.858 23:50:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703128 kB' 'MemFree: 15760100 kB' 'MemUsed: 11943028 kB' 'SwapCached: 0 kB' 'Active: 6092924 kB' 'Inactive: 3312056 kB' 'Active(anon): 5763556 kB' 'Inactive(anon): 0 kB' 'Active(file): 329368 kB' 'Inactive(file): 3312056 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9103004 kB' 'Mapped: 50804 kB' 'AnonPages: 302100 kB' 'Shmem: 5461580 kB' 'KernelStack: 10296 kB' 'PageTables: 3536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122516 kB' 'Slab: 359996 kB' 'SReclaimable: 122516 kB' 'SUnreclaim: 237480 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:42.858 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.858 23:50:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.858 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.858 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.858 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.858 23:50:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.858 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.858 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.858 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.858 23:50:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.858 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.858 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.858 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- 
setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- 
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # continue 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:42.859 23:50:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:42.859 23:50:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.859 23:50:32 -- setup/common.sh@33 -- # echo 0 00:03:42.859 23:50:32 -- setup/common.sh@33 -- # return 0 00:03:42.859 23:50:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:42.859 23:50:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:42.859 23:50:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:42.859 23:50:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:42.859 23:50:32 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:42.859 node0=512 expecting 512 00:03:42.859 23:50:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:42.859 23:50:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:42.859 23:50:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:42.859 23:50:32 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:42.859 node1=512 expecting 512 00:03:42.859 23:50:32 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:42.859 00:03:42.859 real 0m3.447s 00:03:42.859 user 0m1.316s 00:03:42.859 sys 0m2.197s 00:03:42.859 23:50:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:42.859 23:50:32 -- common/autotest_common.sh@10 -- # set +x 00:03:42.859 ************************************ 00:03:42.859 END TEST per_node_1G_alloc 00:03:42.859 ************************************ 00:03:43.119 23:50:32 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:43.119 
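[Note: the helper driving the scans above is setup/common.sh's get_meminfo. As the trace shows, it reads /proc/meminfo, or /sys/devices/system/node/nodeN/meminfo when a node argument is given, strips the 'Node N ' prefix that the per-node files carry, then splits each 'field: value' line on IFS=': ' and loops with 'continue' until the requested field matches. A minimal self-contained sketch of the same technique (assumptions: the name get_meminfo_sketch and the standalone packaging are mine; the steps mirror the xtrace, not the SPDK source verbatim):

shopt -s extglob                        # the "Node +([0-9]) " strip below is an extended glob
get_meminfo_sketch() {                  # usage: get_meminfo_sketch HugePages_Surp [node]
    local get=$1 node=$2 var val _ line
    local mem_f=/proc/meminfo
    # With a node argument, prefer the node-local meminfo file when it exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")    # per-node files prefix every line with "Node N "
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue    # each miss is one 'continue' record in the trace
        echo "$val"                         # e.g. 0 for HugePages_Surp above
        return 0
    done
    return 1
}

Run as 'get_meminfo_sketch HugePages_Surp 1' against the node1 snapshot above, it would print 0, matching the 'echo 0' record in the trace.]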
23:50:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:43.119 23:50:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:43.119 23:50:32 -- common/autotest_common.sh@10 -- # set +x 00:03:43.119 ************************************ 00:03:43.119 START TEST even_2G_alloc 00:03:43.119 ************************************ 00:03:43.119 23:50:32 -- common/autotest_common.sh@1104 -- # even_2G_alloc 00:03:43.119 23:50:32 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:43.119 23:50:32 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:43.119 23:50:32 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:43.119 23:50:32 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:43.119 23:50:32 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:43.119 23:50:32 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:43.119 23:50:32 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:43.119 23:50:32 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:43.119 23:50:32 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:43.119 23:50:32 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:43.119 23:50:32 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:43.119 23:50:32 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:43.119 23:50:32 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:43.119 23:50:32 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:43.119 23:50:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:43.119 23:50:32 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:43.120 23:50:32 -- setup/hugepages.sh@83 -- # : 512 00:03:43.120 23:50:32 -- setup/hugepages.sh@84 -- # : 1 00:03:43.120 23:50:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:43.120 23:50:32 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:43.120 23:50:32 -- setup/hugepages.sh@83 -- # : 0 00:03:43.120 23:50:32 -- setup/hugepages.sh@84 -- # : 0 00:03:43.120 23:50:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:43.120 23:50:32 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:43.120 23:50:32 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:43.120 23:50:32 -- setup/hugepages.sh@153 -- # setup output 00:03:43.120 23:50:32 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:43.120 23:50:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:46.405 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:46.405 0000:80:04.0 (8086 2021): 
Already using the vfio-pci driver
00:03:46.405 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:46.405 23:50:35 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:03:46.405 23:50:35 -- setup/hugepages.sh@89 -- # local node
00:03:46.405 23:50:35 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:46.405 23:50:35 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:46.405 23:50:35 -- setup/hugepages.sh@92 -- # local surp
00:03:46.405 23:50:35 -- setup/hugepages.sh@93 -- # local resv
00:03:46.405 23:50:35 -- setup/hugepages.sh@94 -- # local anon
00:03:46.405 23:50:35 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:46.405 23:50:35 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:46.405 23:50:35 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:46.405 23:50:35 -- setup/common.sh@18 -- # local node=
00:03:46.405 23:50:35 -- setup/common.sh@19 -- # local var val
00:03:46.405 23:50:35 -- setup/common.sh@20 -- # local mem_f mem
00:03:46.405 23:50:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:46.405 23:50:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:46.405 23:50:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:46.405 23:50:35 -- setup/common.sh@28 -- # mapfile -t mem
00:03:46.405 23:50:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:46.405 23:50:35 -- setup/common.sh@31 -- # IFS=': '
00:03:46.405 23:50:35 -- setup/common.sh@31 -- # read -r var val _
00:03:46.405 23:50:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41711604 kB' 'MemAvailable: 45348616 kB' 'Buffers: 10504 kB' 'Cached: 12553204 kB' 'SwapCached: 0 kB' 'Active: 9679316 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085208 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541216 kB' 'Mapped: 154240 kB' 'Shmem: 8546824 kB' 'KReclaimable: 234100 kB' 'Slab: 697956 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463856 kB' 'KernelStack: 21616 kB' 'PageTables: 7580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10343660 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213192 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
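[Note: two details in the verify_nr_hugepages prologue above are worth decoding. The hugepages.sh@96 record is the expanded form of a transparent-hugepage check: the literal 'always [madvise] never' shows THP is in madvise mode on this host, and that gate precedes the @97 AnonHugePages sample, which only matters as a correction term when THP is not disabled. A hedged sketch of the gate (assumption: the exact SPDK source line is not shown in this trace; the sysfs path is the standard kernel one):

thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
    # THP active: anonymous huge pages could skew the hugepage accounting, so sample them.
    grep '^AnonHugePages:' /proc/meminfo              # "AnonHugePages: 0 kB" on this host
fi
]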
[xtrace condensed, 00:03:46.405-00:03:46.406 23:50:35: the scan walks every /proc/meminfo field from MemTotal through HardwareCorrupted; each non-matching field prints its failed '[[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]' test, 'continue', and the IFS=': ' / read -r var val _ pair]
00:03:46.406 23:50:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:46.406 23:50:35 -- setup/common.sh@33 -- # echo 0
00:03:46.406 23:50:35 -- setup/common.sh@33 -- # return 0
00:03:46.406 23:50:35 -- setup/hugepages.sh@97 -- # anon=0
00:03:46.406 23:50:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:46.406 23:50:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:46.406 23:50:35 -- setup/common.sh@18 -- # local node=
00:03:46.406 23:50:35 -- setup/common.sh@19 -- # local var val
00:03:46.406 23:50:35 -- setup/common.sh@20 -- # local mem_f mem
00:03:46.406 23:50:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:46.406 23:50:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:46.406 23:50:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:46.406 23:50:35 -- setup/common.sh@28 -- # mapfile -t mem
00:03:46.406 23:50:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:46.406 23:50:36 -- setup/common.sh@31 -- # IFS=': '
00:03:46.406 23:50:36 -- setup/common.sh@31 -- # read -r var val _
00:03:46.406 23:50:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41713472 kB' 'MemAvailable: 45350484 kB' 'Buffers: 10504 kB' 'Cached: 12553204 kB' 'SwapCached: 0 kB' 'Active: 9678560 kB' 'Inactive: 3422776 kB' 'Active(anon): 9084452 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 540968 kB' 'Mapped: 154132 kB' 'Shmem: 8546824 kB' 'KReclaimable: 234100 kB' 'Slab: 697956 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463856 kB' 'KernelStack: 21584 kB' 'PageTables: 7456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10343672 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213160 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
[xtrace condensed, 00:03:46.406-00:03:46.670 23:50:36: the same scan repeats for HugePages_Surp, hitting 'continue' on every field from MemTotal through HugePages_Free until the match]
00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:46.670 23:50:36 -- setup/common.sh@33 -- # echo 0
00:03:46.670 23:50:36 -- setup/common.sh@33 -- # return 0
00:03:46.670 23:50:36 -- setup/hugepages.sh@99 -- # surp=0
00:03:46.670 23:50:36 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:46.670 23:50:36 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:46.670 23:50:36 -- setup/common.sh@18 -- # local node=
00:03:46.670 23:50:36 -- setup/common.sh@19 -- # local var val
00:03:46.670 23:50:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:46.670 23:50:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:46.670 23:50:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:46.670 23:50:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:46.670 23:50:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:46.670 23:50:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': '
00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _
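[Note: at this point two of the three correction terms are in hand (anon=0, surp=0) and the trace is fetching the third, HugePages_Rsvd. All three are single /proc/meminfo fields, so the whole sequence can be spot-checked in one pass; an equivalent one-liner (assumption: this awk formulation is a stand-in for the loop above, not SPDK's code):

awk '/^(AnonHugePages|HugePages_Rsvd|HugePages_Surp):/ { print $1, $2 }' /proc/meminfo
# expected on this host: AnonHugePages: 0 / HugePages_Rsvd: 0 / HugePages_Surp: 0
]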
60295212 kB' 'MemFree: 41713580 kB' 'MemAvailable: 45350592 kB' 'Buffers: 10504 kB' 'Cached: 12553204 kB' 'SwapCached: 0 kB' 'Active: 9678244 kB' 'Inactive: 3422776 kB' 'Active(anon): 9084136 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 540652 kB' 'Mapped: 154132 kB' 'Shmem: 8546824 kB' 'KReclaimable: 234100 kB' 'Slab: 697956 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463856 kB' 'KernelStack: 21584 kB' 'PageTables: 7456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10343688 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213160 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Active(anon) 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # read -r var val _ 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue 00:03:46.670 23:50:36 -- setup/common.sh@31 -- # IFS=': ' 00:03:46.670 23:50:36 -- 
setup/common.sh@31 -- # read -r var val _
00:03:46.670 23:50:36 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:46.670 23:50:36 -- setup/common.sh@32 -- # continue
00:03:46.670 [... the same check-and-continue pair repeats for every remaining non-matching key: Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free ...]
00:03:46.671 23:50:36 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:46.671 23:50:36 -- setup/common.sh@33 -- # echo 0
00:03:46.671 23:50:36 -- setup/common.sh@33 -- # return 0
00:03:46.671 23:50:36 -- setup/hugepages.sh@100 -- # resv=0
00:03:46.671 23:50:36 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:46.671 nr_hugepages=1024
00:03:46.671 23:50:36 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:46.671 resv_hugepages=0
00:03:46.671 23:50:36 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:46.671 surplus_hugepages=0
00:03:46.671 23:50:36 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:46.671 anon_hugepages=0
00:03:46.671 23:50:36 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:46.671 23:50:36 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
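The loop traced above is setup/common.sh's get_meminfo: it splits each meminfo line on ': ', continues past every key that is not the one requested, and echoes the value of the first match (here HugePages_Rsvd -> 0). A minimal standalone sketch of that parsing, reconstructed from the xtrace; the function name get_meminfo_sketch is ours, not SPDK's:

#!/usr/bin/env bash
shopt -s extglob # common.sh relies on extglob for the "Node N " strip below

# Re-creation of the get_meminfo loop traced above: split each meminfo line
# on ': ', skip (continue) every key that is not the one asked for, and
# print the value of the first match.
get_meminfo_sketch() { # usage: get_meminfo_sketch <key> [node]
    local get=$1 node=$2 line var val _
    local mem_f=/proc/meminfo
    # with a node argument, read that node's sysfs meminfo instead
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local -a mem
    mapfile -t mem <"$mem_f"
    mem=("${mem[@]#Node +([0-9]) }") # per-node lines carry a "Node N " prefix
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

get_meminfo_sketch HugePages_Rsvd   # prints 0 on this box, as in the trace
get_meminfo_sketch HugePages_Surp 0 # node0 variant used further down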
00:03:46.671 23:50:36 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:46.671 23:50:36 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:46.671 23:50:36 -- setup/common.sh@18 -- # local node=
00:03:46.671 23:50:36 -- setup/common.sh@19 -- # local var val
00:03:46.671 23:50:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:46.671 23:50:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:46.671 23:50:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:46.671 23:50:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:46.671 23:50:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:46.671 23:50:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:46.671 23:50:36 -- setup/common.sh@31 -- # IFS=': '
00:03:46.671 23:50:36 -- setup/common.sh@31 -- # read -r var val _
00:03:46.671 23:50:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41713956 kB' 'MemAvailable: 45350968 kB' 'Buffers: 10504 kB' 'Cached: 12553232 kB' 'SwapCached: 0 kB' 'Active: 9678568 kB' 'Inactive: 3422776 kB' 'Active(anon): 9084460 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 540964 kB' 'Mapped: 154132 kB' 'Shmem: 8546852 kB' 'KReclaimable: 234100 kB' 'Slab: 697956 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463856 kB' 'KernelStack: 21584 kB' 'PageTables: 7456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10343700 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213160 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:03:46.672 [... every key from MemTotal through Unaccepted is checked against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l and skipped with continue ...]
00:03:46.672 23:50:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:46.672 23:50:36 -- setup/common.sh@33 -- # echo 1024
00:03:46.672 23:50:36 -- setup/common.sh@33 -- # return 0
00:03:46.672 23:50:36 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:46.672 23:50:36 -- setup/hugepages.sh@112 -- # get_nodes
00:03:46.672 23:50:36 -- setup/hugepages.sh@27 -- # local node
00:03:46.672 23:50:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:46.672 23:50:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:46.672 23:50:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:46.672 23:50:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:46.672 23:50:36 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:46.672 23:50:36 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:46.672 23:50:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:46.672 23:50:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
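The @107 and @110 checks above reduce to one identity: the kernel's HugePages_Total must equal the requested page count plus surplus plus reserved pages. A one-liner equivalent, with this run's values; awk stands in here for the script's own meminfo parser:

#!/usr/bin/env bash
# The identity behind hugepages.sh@107/@110, with this run's values.
nr_hugepages=1024 surp=0 resv=0
total=$(awk '/^HugePages_Total/ {print $2}' /proc/meminfo)
(( total == nr_hugepages + surp + resv )) && echo "hugepage accounting OK (total=$total)"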
00:03:46.672 23:50:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:46.672 23:50:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:46.672 23:50:36 -- setup/common.sh@18 -- # local node=0
00:03:46.672 23:50:36 -- setup/common.sh@19 -- # local var val
00:03:46.672 23:50:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:46.672 23:50:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:46.672 23:50:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:46.672 23:50:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:46.672 23:50:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:46.672 23:50:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:46.672 23:50:36 -- setup/common.sh@31 -- # IFS=': '
00:03:46.672 23:50:36 -- setup/common.sh@31 -- # read -r var val _
00:03:46.672 23:50:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25926556 kB' 'MemUsed: 6665528 kB' 'SwapCached: 0 kB' 'Active: 3587140 kB' 'Inactive: 110720 kB' 'Active(anon): 3322400 kB' 'Inactive(anon): 0 kB' 'Active(file): 264740 kB' 'Inactive(file): 110720 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3460748 kB' 'Mapped: 103328 kB' 'AnonPages: 240416 kB' 'Shmem: 3085288 kB' 'KernelStack: 11320 kB' 'PageTables: 4076 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111584 kB' 'Slab: 338080 kB' 'SReclaimable: 111584 kB' 'SUnreclaim: 226496 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:46.672 [... keys MemTotal through HugePages_Free are checked against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and skipped with continue ...]
00:03:46.673 23:50:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:46.673 23:50:36 -- setup/common.sh@33 -- # echo 0
00:03:46.673 23:50:36 -- setup/common.sh@33 -- # return 0
00:03:46.673 23:50:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:46.673 23:50:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:46.673 23:50:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
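Next the same probe runs against node1. The whole per-node pass amounts to reading one counter out of each node's sysfs meminfo and accumulating it; a standalone sketch of that reading follows (loop shape and array name are ours, reconstructed from the trace):

#!/usr/bin/env bash
# Walk every NUMA node and pull one hugepage counter out of its sysfs
# meminfo, the way the trace's per-node loop does via get_meminfo <key> <node>.
declare -a nodes_test
for node_dir in /sys/devices/system/node/node[0-9]*; do
    node=${node_dir##*node}
    # per-node meminfo lines look like "Node 0 HugePages_Surp: 0"
    surp=$(awk '/HugePages_Surp/ {print $NF}' "$node_dir/meminfo")
    (( nodes_test[node] += surp ))
    echo "node${node}: HugePages_Surp=${surp}"
done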
00:03:46.673 23:50:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:46.673 23:50:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:46.673 23:50:36 -- setup/common.sh@18 -- # local node=1
00:03:46.673 23:50:36 -- setup/common.sh@19 -- # local var val
00:03:46.673 23:50:36 -- setup/common.sh@20 -- # local mem_f mem
00:03:46.673 23:50:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:46.673 23:50:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:46.673 23:50:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:46.673 23:50:36 -- setup/common.sh@28 -- # mapfile -t mem
00:03:46.673 23:50:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:46.673 23:50:36 -- setup/common.sh@31 -- # IFS=': '
00:03:46.673 23:50:36 -- setup/common.sh@31 -- # read -r var val _
00:03:46.673 23:50:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703128 kB' 'MemFree: 15787400 kB' 'MemUsed: 11915728 kB' 'SwapCached: 0 kB' 'Active: 6091468 kB' 'Inactive: 3312056 kB' 'Active(anon): 5762100 kB' 'Inactive(anon): 0 kB' 'Active(file): 329368 kB' 'Inactive(file): 3312056 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9103004 kB' 'Mapped: 50804 kB' 'AnonPages: 300556 kB' 'Shmem: 5461580 kB' 'KernelStack: 10264 kB' 'PageTables: 3380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122516 kB' 'Slab: 359876 kB' 'SReclaimable: 122516 kB' 'SUnreclaim: 237360 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:46.673 [... keys MemTotal through HugePages_Free are checked against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and skipped with continue ...]
00:03:46.674 23:50:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:46.674 23:50:36 -- setup/common.sh@33 -- # echo 0
00:03:46.674 23:50:36 -- setup/common.sh@33 -- # return 0
00:03:46.674 23:50:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:46.674 23:50:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:46.674 23:50:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:46.674 23:50:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:46.674 23:50:36 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:46.674 node0=512 expecting 512
00:03:46.674 23:50:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:46.674 23:50:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:46.674 23:50:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:46.674 23:50:36 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:46.674 node1=512 expecting 512
00:03:46.674 23:50:36 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:46.674
00:03:46.674 real    0m3.692s
00:03:46.674 user    0m1.427s
00:03:46.674 sys     0m2.328s
00:03:46.674 23:50:36 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:46.674 23:50:36 -- common/autotest_common.sh@10 -- # set +x
00:03:46.674 ************************************
00:03:46.674 END TEST even_2G_alloc
00:03:46.674 ************************************
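The odd_alloc test that starts below asks get_test_nr_hugepages for 2098176 kB (HUGEMEM=2049 MB); at the 2048 kB default hugepage size that works out to 1025 pages (2098176 / 2048 = 1024.5, rounded up), which the per-node helper then deals out as node1=512 and node0=513, visible in the trace as the two nodes_test assignments. A sketch of that split, reconstructed from the trace; variable names are ours:

#!/usr/bin/env bash
# How 1025 pages end up as node0=513 / node1=512: each node (highest id
# first) gets the floor of what is left divided by the nodes still unserved.
total=1025 nodes=2
declare -a per_node
left=$total
for ((n = nodes - 1; n >= 0; n--)); do
    per_node[n]=$((left / (n + 1))) # n=1: 1025/2 -> 512; n=0: 513/1 -> 513
    left=$((left - per_node[n]))
done
echo "node0=${per_node[0]} node1=${per_node[1]}" # node0=513 node1=512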
00:03:46.674 23:50:36 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:03:46.674 23:50:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:46.674 23:50:36 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:46.674 23:50:36 -- common/autotest_common.sh@10 -- # set +x
00:03:46.674 ************************************
00:03:46.674 START TEST odd_alloc
00:03:46.674 ************************************
00:03:46.674 23:50:36 -- common/autotest_common.sh@1104 -- # odd_alloc
00:03:46.674 23:50:36 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:03:46.674 23:50:36 -- setup/hugepages.sh@49 -- # local size=2098176
00:03:46.674 23:50:36 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:46.674 23:50:36 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:46.674 23:50:36 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:03:46.674 23:50:36 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:46.674 23:50:36 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:46.674 23:50:36 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:46.674 23:50:36 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:03:46.674 23:50:36 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:46.674 23:50:36 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:46.674 23:50:36 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:46.674 23:50:36 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:46.674 23:50:36 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:46.674 23:50:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:46.674 23:50:36 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:46.674 23:50:36 -- setup/hugepages.sh@83 -- # : 513
00:03:46.674 23:50:36 -- setup/hugepages.sh@84 -- # : 1
00:03:46.674 23:50:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:46.674 23:50:36 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:03:46.674 23:50:36 -- setup/hugepages.sh@83 -- # : 0
00:03:46.674 23:50:36 -- setup/hugepages.sh@84 -- # : 0
00:03:46.674 23:50:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:46.674 23:50:36 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:03:46.674 23:50:36 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:03:46.674 23:50:36 -- setup/hugepages.sh@160 -- # setup output
00:03:46.674 23:50:36 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:46.674 23:50:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:49.961 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:49.961 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:49.961 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:50.223 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:50.223 23:50:39 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:03:50.223 23:50:39 -- setup/hugepages.sh@89 -- # local node
00:03:50.223 23:50:39 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:50.223 23:50:39 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:50.223 23:50:39 -- setup/hugepages.sh@92 -- # local surp
00:03:50.223 23:50:39 -- setup/hugepages.sh@93 -- # local resv
00:03:50.223 23:50:39 -- setup/hugepages.sh@94 -- # local anon
00:03:50.223 23:50:39 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
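The @96 check just above is verify_nr_hugepages reading /sys/kernel/mm/transparent_hugepage/enabled: anonymous hugepages are only counted when THP is not pinned to [never]. A sketch of that guard (awk stands in for the script's own get_meminfo):

#!/usr/bin/env bash
# The THP guard behind hugepages.sh@96: only count anonymous hugepages when
# transparent hugepages are not disabled outright.
thp=$(</sys/kernel/mm/transparent_hugepage/enabled) # e.g. "always [madvise] never"
anon=0
if [[ $thp != *"[never]"* ]]; then
    anon=$(awk '/^AnonHugePages/ {print $2}' /proc/meminfo)
fi
echo "anon_hugepages=$anon" # 0 in this run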
00:03:50.223 23:50:39 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:50.223 23:50:39 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:50.223 23:50:39 -- setup/common.sh@18 -- # local node=
00:03:50.223 23:50:39 -- setup/common.sh@19 -- # local var val
00:03:50.223 23:50:39 -- setup/common.sh@20 -- # local mem_f mem
00:03:50.223 23:50:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:50.223 23:50:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:50.223 23:50:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:50.223 23:50:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:50.223 23:50:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:50.223 23:50:39 -- setup/common.sh@31 -- # IFS=': '
00:03:50.223 23:50:39 -- setup/common.sh@31 -- # read -r var val _
00:03:50.223 23:50:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41738352 kB' 'MemAvailable: 45375364 kB' 'Buffers: 10504 kB' 'Cached: 12553352 kB' 'SwapCached: 0 kB' 'Active: 9679612 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085504 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541456 kB' 'Mapped: 154244 kB' 'Shmem: 8546972 kB' 'KReclaimable: 234100 kB' 'Slab: 697852 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463752 kB' 'KernelStack: 21600 kB' 'PageTables: 7508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486608 kB' 'Committed_AS: 10344588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213160 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:03:50.223 [... keys MemTotal through HardwareCorrupted are checked against \A\n\o\n\H\u\g\e\P\a\g\e\s and skipped with continue ...]
00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:50.224 23:50:39 -- setup/common.sh@33 -- # echo 0
00:03:50.224 23:50:39 -- setup/common.sh@33 -- # return 0
00:03:50.224 23:50:39 -- setup/hugepages.sh@97 -- # anon=0
00:03:50.224 23:50:39 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:50.224 23:50:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:50.224 23:50:39 -- setup/common.sh@18 -- # local node=
00:03:50.224 23:50:39 -- setup/common.sh@19 -- # local var val
00:03:50.224 23:50:39 -- setup/common.sh@20 -- # local mem_f mem
00:03:50.224 23:50:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:50.224 23:50:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:50.224 23:50:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:50.224 23:50:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:50.224 23:50:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': '
00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _
463752 kB' 'KernelStack: 21584 kB' 'PageTables: 7476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486608 kB' 'Committed_AS: 10346220 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213096 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.224 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.224 23:50:39 -- setup/common.sh@32 -- # [[ Active(file) 
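The key-by-key trace above (and the condensed stretches below) is one lookup repeated per meminfo key. A minimal sketch of that lookup, reconstructed purely from the setup/common.sh@16-@33 trace lines — the function name and shape follow the trace, not the SPDK source, and the real helper may differ in details such as error handling:

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern below
    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        local -a mem
        local var val _ line
        # with a node argument, read the per-node sysfs file instead
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # per-node lines carry a "Node N " prefix; strip it
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # the long continue runs seen in the trace
            echo "$val"
            return 0
        done
        return 1
    }
    get_meminfo AnonHugePages   # -> 0 (kB), matching the echo 0 / return 0 above

Splitting on IFS=': ' turns a line like 'HugePages_Total: 1025' into var=HugePages_Total and val=1025, so the same scan works for both kB-suffixed and bare-count keys.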
[trace condensed: the same @31-@32 loop now matches against \H\u\g\e\P\a\g\e\s\_\S\u\r\p — every key from MemTotal through Unaccepted is skipped with continue]
[HugePages_Total, HugePages_Free and HugePages_Rsvd likewise skipped]
00:03:50.226 23:50:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:50.226 23:50:39 -- setup/common.sh@33 -- # echo 0
00:03:50.226 23:50:39 -- setup/common.sh@33 -- # return 0
00:03:50.226 23:50:39 -- setup/hugepages.sh@99 -- # surp=0
00:03:50.226 23:50:39 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:50.226 23:50:39 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
[setup/common.sh@18-@29 re-run as above against /proc/meminfo]
00:03:50.226 23:50:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41734176 kB' 'MemAvailable: 45371188 kB' 'Buffers: 10504 kB' 'Cached: 12553368 kB' 'SwapCached: 0 kB' 'Active: 9685292 kB' 'Inactive: 3422776 kB' 'Active(anon): 9091184 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547660 kB' 'Mapped: 155060 kB' 'Shmem: 8546988 kB' 'KReclaimable: 234100 kB' 'Slab: 697820 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463720 kB' 'KernelStack: 21584 kB' 'PageTables: 7468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486608 kB' 'Committed_AS: 10350736 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213100 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
[trace condensed: the scan now runs against \H\u\g\e\P\a\g\e\s\_\R\s\v\d — MemTotal through Percpu all skipped with continue]
[HardwareCorrupted through HugePages_Free likewise skipped]
00:03:50.488 23:50:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:50.488 23:50:39 -- setup/common.sh@33 -- # echo 0
00:03:50.488 23:50:39 -- setup/common.sh@33 -- # return 0
00:03:50.488 23:50:39 -- setup/hugepages.sh@100 -- # resv=0
00:03:50.488 23:50:39 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
nr_hugepages=1025
00:03:50.488 23:50:39 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:50.488 23:50:39 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:03:50.488 23:50:39 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:50.488 23:50:39 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:50.488 23:50:39 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:03:50.488 23:50:39 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:50.488 23:50:39 -- setup/common.sh@17 -- # local get=HugePages_Total
[setup/common.sh@18-@29 re-run as above against /proc/meminfo]
00:03:50.488 23:50:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41739888 kB' 'MemAvailable: 45376900 kB' 'Buffers: 10504 kB' 'Cached: 12553380 kB' 'SwapCached: 0 kB' 'Active: 9680808 kB' 'Inactive: 3422776 kB' 'Active(anon): 9086700 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 543116 kB' 'Mapped: 154648 kB' 'Shmem: 8547000 kB' 'KReclaimable: 234100 kB' 'Slab: 697820 kB' 'SReclaimable: 234100 kB' 'SUnreclaim: 463720 kB' 'KernelStack: 21584 kB' 'PageTables: 7444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486608 kB' 'Committed_AS: 10346908 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213080 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
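The hugepages.sh@97-@110 lines above boil down to four lookups and two arithmetic assertions. A hedged sketch of that accounting, reusing the hypothetical get_meminfo sketch from earlier (values as observed in this run):

    anon=$(get_meminfo AnonHugePages)    # 0 kB of anonymous transparent hugepages
    surp=$(get_meminfo HugePages_Surp)   # 0
    resv=$(get_meminfo HugePages_Rsvd)   # 0
    total=$(get_meminfo HugePages_Total) # 1025
    nr_hugepages=1025                    # page count requested by the test
    # both identities the trace evaluates hold here: 1025 == 1025 + 0 + 0
    (( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch'
    (( total == nr_hugepages )) || echo 'unexpected surplus/reserved pages'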
[trace condensed: the scan now runs against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l — MemTotal through AnonHugePages all skipped with continue]
[ShmemHugePages through Unaccepted likewise skipped]
00:03:50.490 23:50:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:50.490 23:50:39 -- setup/common.sh@33 -- # echo 1025
00:03:50.490 23:50:39 -- setup/common.sh@33 -- # return 0
00:03:50.490 23:50:39 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:50.490 23:50:39 -- setup/hugepages.sh@112 -- # get_nodes
00:03:50.490 23:50:39 -- setup/hugepages.sh@27 -- # local node
00:03:50.490 23:50:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:50.490 23:50:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:50.490 23:50:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:50.490 23:50:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:03:50.490 23:50:39 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:50.490 23:50:39 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
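The get_nodes trace just above enumerates NUMA nodes by globbing sysfs and records each node's hugepage count (512 and 513 in this run). A sketch of that enumeration — the loop shape comes from the @27-@33 trace lines, but xtrace shows only the already-expanded values, so the right-hand side here (reading the per-node 2 MB nr_hugepages file) is an assumption, not the script's actual expression:

    shopt -s extglob nullglob
    declare -a nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        # ${node##*node} reduces ".../node0" to the bare index "0"
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}
    (( no_nodes > 0 )) || echo 'no NUMA nodes found'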
00:03:50.490 23:50:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:50.490 23:50:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:50.490 23:50:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:50.490 23:50:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:50.490 23:50:39 -- setup/common.sh@18 -- # local node=0
00:03:50.490 23:50:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:50.490 23:50:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:50.490 23:50:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:50.490 23:50:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:50.490 23:50:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:50.490 23:50:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25933572 kB' 'MemUsed: 6658512 kB' 'SwapCached: 0 kB' 'Active: 3593168 kB' 'Inactive: 110720 kB' 'Active(anon): 3328428 kB' 'Inactive(anon): 0 kB' 'Active(file): 264740 kB' 'Inactive(file): 110720 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3460844 kB' 'Mapped: 104088 kB' 'AnonPages: 246388 kB' 'Shmem: 3085384 kB' 'KernelStack: 11304 kB' 'PageTables: 4040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111584 kB' 'Slab: 338068 kB' 'SReclaimable: 111584 kB' 'SUnreclaim: 226484 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
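Unlike the earlier /proc/meminfo reads, the node0 dump above comes from sysfs, where every line carries a "Node 0 " prefix; the setup/common.sh@29 expansion strips it so the same key scan works for both sources. A one-line illustration, using a sample line shaped like the node0 dump above (requires extglob):

    shopt -s extglob
    line='Node 0 HugePages_Total:   512'
    echo "${line#Node +([0-9]) }"   # -> 'HugePages_Total:   512'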
[trace condensed: node0 scan against \H\u\g\e\P\a\g\e\s\_\S\u\r\p — MemTotal through FilePmdMapped all skipped with continue]
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@33 -- # echo 0 00:03:50.491 23:50:39 -- setup/common.sh@33 -- # return 0 00:03:50.491 23:50:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:50.491 23:50:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:50.491 23:50:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:50.491 23:50:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:50.491 23:50:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:50.491 23:50:39 -- setup/common.sh@18 -- # local node=1 00:03:50.491 23:50:39 -- setup/common.sh@19 -- # local var val 00:03:50.491 23:50:39 -- setup/common.sh@20 -- # local mem_f mem 00:03:50.491 23:50:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.491 23:50:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:50.491 23:50:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:50.491 23:50:39 -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.491 23:50:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703128 kB' 'MemFree: 15806900 kB' 'MemUsed: 11896228 kB' 'SwapCached: 0 kB' 'Active: 6091860 kB' 'Inactive: 3312056 kB' 'Active(anon): 5762492 kB' 'Inactive(anon): 0 kB' 'Active(file): 329368 kB' 'Inactive(file): 3312056 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9103056 kB' 'Mapped: 50960 kB' 'AnonPages: 300924 kB' 'Shmem: 5461632 kB' 'KernelStack: 10280 kB' 'PageTables: 3432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122516 kB' 'Slab: 359752 kB' 'SReclaimable: 122516 kB' 'SUnreclaim: 237236 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val 
_ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.491 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.491 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # continue 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.492 23:50:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.492 23:50:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.492 23:50:39 -- setup/common.sh@33 -- # echo 0 00:03:50.492 23:50:39 -- setup/common.sh@33 -- # return 0 00:03:50.492 23:50:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:50.492 23:50:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:50.492 23:50:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:50.492 23:50:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:50.492 23:50:39 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:03:50.492 node0=512 expecting 513 00:03:50.492 23:50:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:50.492 23:50:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:50.492 23:50:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:50.492 23:50:39 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:50.492 node1=513 expecting 512 00:03:50.492 23:50:39 -- 
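Note: the two condensed loops above are setup/common.sh's get_meminfo pattern at work: read a per-node meminfo file, strip the "Node <N> " prefix, then split each line with IFS=': ' until the requested key matches. A minimal standalone sketch of that technique, assuming GNU bash 4+ (the function name and error handling here are illustrative, not copied from SPDK's setup/common.sh):

    #!/usr/bin/env bash
    # Sketch: look up one field from /proc/meminfo or a per-NUMA-node meminfo file.
    shopt -s extglob   # needed for the +([0-9]) pattern below
    get_meminfo_sketch() {
        local get=$1 node=$2 var val _
        local mem_f=/proc/meminfo mem line
        # Per-node counters live under /sys/devices/system/node/node<N>/meminfo
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Per-node lines carry a "Node <N> " prefix; strip it so keys line up
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }
    # Against the node0 snapshot above: get_meminfo_sketch HugePages_Surp 0  -> 0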
00:03:50.492 23:50:39 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:03:50.492
00:03:50.492 real 0m3.720s
00:03:50.492 user 0m1.411s
00:03:50.492 sys 0m2.374s
00:03:50.492 23:50:39 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:50.492 23:50:39 -- common/autotest_common.sh@10 -- # set +x
00:03:50.492 ************************************
00:03:50.492 END TEST odd_alloc
00:03:50.492 ************************************
00:03:50.492 23:50:39 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:03:50.492 23:50:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:50.492 23:50:39 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:50.492 23:50:39 -- common/autotest_common.sh@10 -- # set +x
00:03:50.492 ************************************
00:03:50.492 START TEST custom_alloc
00:03:50.492 ************************************
00:03:50.492 23:50:39 -- common/autotest_common.sh@1104 -- # custom_alloc
00:03:50.492 23:50:39 -- setup/hugepages.sh@167 -- # local IFS=,
00:03:50.492 23:50:39 -- setup/hugepages.sh@169 -- # local node
00:03:50.492 23:50:39 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:03:50.492 23:50:39 -- setup/hugepages.sh@170 -- # local nodes_hp
00:03:50.492 23:50:39 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:03:50.492 23:50:39 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:03:50.492 23:50:39 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:50.492 23:50:39 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:50.492 23:50:39 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:50.492 23:50:39 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:50.492 23:50:39 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:50.492 23:50:39 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:50.492 23:50:39 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:50.492 23:50:39 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:50.492 23:50:39 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:50.492 23:50:39 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:50.492 23:50:39 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:50.492 23:50:39 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:50.493 23:50:39 -- setup/hugepages.sh@83 -- # : 256
00:03:50.493 23:50:39 -- setup/hugepages.sh@84 -- # : 1
00:03:50.493 23:50:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:50.493 23:50:39 -- setup/hugepages.sh@83 -- # : 0
00:03:50.493 23:50:39 -- setup/hugepages.sh@84 -- # : 0
00:03:50.493 23:50:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:03:50.493 23:50:39 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:03:50.493 23:50:39 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:50.493 23:50:39 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:50.493 23:50:39 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
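Note: the nr_hugepages values in this trace follow directly from the 2048 kB default hugepage size reported in the meminfo snapshots: get_test_nr_hugepages is handed a size in kB and divides it down to a page count. A quick arithmetic check in plain bash, with all values taken from the trace:

    default_hugepages=2048                    # kB, per 'Hugepagesize: 2048 kB'
    echo $(( 1048576 / default_hugepages ))   # 1 GiB request -> 512 pages
    echo $(( 2097152 / default_hugepages ))   # 2 GiB request -> 1024 pages
    # With no user-supplied node list, the default split halves the count across
    # the box's 2 NUMA nodes: 512 / 2 = 256 per node, matching the
    # nodes_test[_no_nodes - 1]=256 records above.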
00:03:50.493 23:50:39 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:50.493 23:50:39 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:50.493 23:50:39 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:50.493 23:50:39 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:50.493 23:50:39 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:50.493 23:50:39 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:50.493 23:50:39 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:50.493 23:50:39 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:50.493 23:50:39 -- setup/hugepages.sh@78 -- # return 0
00:03:50.493 23:50:39 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:03:50.493 23:50:39 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:50.493 23:50:39 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:50.493 23:50:39 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:50.493 23:50:39 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:50.493 23:50:39 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:03:50.493 23:50:39 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:50.493 23:50:39 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:50.493 23:50:39 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:50.493 23:50:39 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:50.493 23:50:39 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:50.493 23:50:39 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:50.493 23:50:39 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:03:50.493 23:50:39 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:50.493 23:50:39 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:50.493 23:50:39 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:50.493 23:50:39 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:03:50.493 23:50:39 -- setup/hugepages.sh@78 -- # return 0
00:03:50.493 23:50:39 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:03:50.493 23:50:39 -- setup/hugepages.sh@187 -- # setup output
00:03:50.493 23:50:39 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:50.493 23:50:39 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:53.783 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
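Note: the HUGENODE string handed to setup.sh is just the nodes_hp array joined with commas (custom_alloc declares local IFS=, at hugepages.sh@167 for exactly this purpose). A short sketch of the join as traced above:

    declare -a nodes_hp=([0]=512 [1]=1024)   # per-node targets from the trace
    HUGENODE=()
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    done
    ( IFS=,; echo "HUGENODE='${HUGENODE[*]}'" )
    # -> HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024', matching hugepages.sh@187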
00:03:53.783 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:53.783 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:54.046 23:50:43 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:03:54.046 23:50:43 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:03:54.046 23:50:43 -- setup/hugepages.sh@89 -- # local node
00:03:54.046 23:50:43 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:54.046 23:50:43 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:54.046 23:50:43 -- setup/hugepages.sh@92 -- # local surp
00:03:54.046 23:50:43 -- setup/hugepages.sh@93 -- # local resv
00:03:54.046 23:50:43 -- setup/hugepages.sh@94 -- # local anon
00:03:54.046 23:50:43 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:54.046 23:50:43 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:54.046 23:50:43 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:54.046 23:50:43 -- setup/common.sh@18 -- # local node=
00:03:54.046 23:50:43 -- setup/common.sh@19 -- # local var val
00:03:54.046 23:50:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:54.046 23:50:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:54.046 23:50:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:54.046 23:50:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:54.046 23:50:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:54.046 23:50:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:54.046 23:50:43 -- setup/common.sh@31 -- # IFS=': '
00:03:54.046 23:50:43 -- setup/common.sh@31 -- # read -r var val _
00:03:54.046 23:50:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 40695476 kB' 'MemAvailable: 44332456 kB' 'Buffers: 10504 kB' 'Cached: 12553488 kB' 'SwapCached: 0 kB' 'Active: 9679788 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085680 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541392 kB' 'Mapped: 154248 kB' 'Shmem: 8547108 kB' 'KReclaimable: 234036 kB' 'Slab: 697880 kB' 'SReclaimable: 234036 kB' 'SUnreclaim: 463844 kB' 'KernelStack: 21568 kB' 'PageTables: 7432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963344 kB' 'Committed_AS: 10345248 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213208 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:03:54.046 [xtrace condensed: common.sh@32 walks every field of the snapshot toward AnonHugePages, issuing continue for each non-matching key]
00:03:54.047 23:50:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:54.047 23:50:43 -- setup/common.sh@33 -- # echo 0
00:03:54.047 23:50:43 -- setup/common.sh@33 -- # return 0
00:03:54.047 23:50:43 -- setup/hugepages.sh@97 -- # anon=0
00:03:54.047 23:50:43 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:54.047 23:50:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:54.047 23:50:43 -- setup/common.sh@18 -- # local node=
00:03:54.047 23:50:43 -- setup/common.sh@19 -- # local var val
00:03:54.047 23:50:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:54.047 23:50:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:54.047 23:50:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:54.047 23:50:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:54.047 23:50:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:54.047 23:50:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:54.047 23:50:43 -- setup/common.sh@31 -- # IFS=': '
00:03:54.047 23:50:43 -- setup/common.sh@31 -- # read -r var val _
00:03:54.047 23:50:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 40696472 kB' 'MemAvailable: 44333440 kB' 'Buffers: 10504 kB' 'Cached: 12553492 kB' 'SwapCached: 0 kB' 'Active: 9679036 kB' 'Inactive: 3422776 kB' 'Active(anon): 9084928 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541140 kB' 'Mapped: 154148 kB' 'Shmem: 8547112 kB' 'KReclaimable: 234012 kB' 'Slab: 697844 kB' 'SReclaimable: 234012 kB' 'SUnreclaim: 463832 kB' 'KernelStack: 21584 kB' 'PageTables: 7472 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963344 kB' 'Committed_AS: 10345260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213192 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:03:54.047 [xtrace condensed: the same key-matching loop runs toward HugePages_Surp]
00:03:54.048 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:54.048 23:50:43 -- setup/common.sh@33 -- # echo 0
00:03:54.048 23:50:43 -- setup/common.sh@33 -- # return 0
00:03:54.048 23:50:43 -- setup/hugepages.sh@99 -- # surp=0
00:03:54.048 23:50:43 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:54.048 23:50:43 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:54.048 23:50:43 -- setup/common.sh@18 -- # local node=
00:03:54.048 23:50:43 -- setup/common.sh@19 -- # local var val
00:03:54.048 23:50:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:54.048 23:50:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:54.048 23:50:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
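Note: verify_nr_hugepages has now collected anon=0 (AnonHugePages) and surp=0 (HugePages_Surp) and is fetching HugePages_Rsvd next; the snapshot already shows 'HugePages_Rsvd: 0'. The global consistency check these three values feed can be sketched as follows (the exact comparison shape is an assumption drawn from the trace, not a copy of hugepages.sh):

    nr_hugepages=1536   # requested: 512 on node0 + 1024 on node1
    total=1536 free=1536 surp=0 resv=0 anon=0   # from the /proc/meminfo snapshot
    (( anon == 0 )) || echo "unexpected THP usage: ${anon} kB"
    (( total == nr_hugepages + surp )) &&
        echo "hugepage pool consistent: $total total, $free free, $resv reserved"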
00:03:54.048 23:50:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:54.048 23:50:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:54.048 23:50:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:54.048 23:50:43 -- setup/common.sh@31 -- # IFS=': '
00:03:54.048 23:50:43 -- setup/common.sh@31 -- # read -r var val _
00:03:54.048 23:50:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 40696164 kB' 'MemAvailable: 44333132 kB' 'Buffers: 10504 kB' 'Cached: 12553504 kB' 'SwapCached: 0 kB' 'Active: 9679052 kB' 'Inactive: 3422776 kB' 'Active(anon): 9084944 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541144 kB' 'Mapped: 154148 kB' 'Shmem: 8547124 kB' 'KReclaimable: 234012 kB' 'Slab: 697844 kB' 'SReclaimable: 234012 kB' 'SUnreclaim: 463832 kB' 'KernelStack: 21584 kB' 'PageTables: 7472 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963344 kB' 'Committed_AS: 10345276 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213192 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:03:54.048 [xtrace condensed: the common.sh@32 key-matching loop advances toward HugePages_Rsvd; the trace resumes mid-iteration below]
00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[
WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.049 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.049 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.050 23:50:43 -- setup/common.sh@33 -- # echo 0 00:03:54.050 23:50:43 -- setup/common.sh@33 -- # return 0 00:03:54.050 23:50:43 -- setup/hugepages.sh@100 -- # resv=0 00:03:54.050 23:50:43 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:54.050 nr_hugepages=1536 00:03:54.050 23:50:43 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:54.050 resv_hugepages=0 00:03:54.050 23:50:43 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:54.050 surplus_hugepages=0 00:03:54.050 23:50:43 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:54.050 anon_hugepages=0 00:03:54.050 23:50:43 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:54.050 23:50:43 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:54.050 23:50:43 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:54.050 23:50:43 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:54.050 23:50:43 -- setup/common.sh@18 -- # local node= 00:03:54.050 23:50:43 -- setup/common.sh@19 -- # local var val 00:03:54.050 23:50:43 -- setup/common.sh@20 -- # local mem_f mem 00:03:54.050 23:50:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.050 23:50:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.050 23:50:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.050 23:50:43 -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.050 23:50:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.050 23:50:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 40695912 kB' 'MemAvailable: 44332880 kB' 'Buffers: 10504 kB' 'Cached: 12553528 kB' 'SwapCached: 0 kB' 'Active: 9678704 kB' 'Inactive: 3422776 kB' 'Active(anon): 9084596 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 
kB' 'Writeback: 0 kB' 'AnonPages: 540752 kB' 'Mapped: 154148 kB' 'Shmem: 8547148 kB' 'KReclaimable: 234012 kB' 'Slab: 697844 kB' 'SReclaimable: 234012 kB' 'SUnreclaim: 463832 kB' 'KernelStack: 21568 kB' 'PageTables: 7420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963344 kB' 'Committed_AS: 10345288 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213192 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- 
setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.050 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.050 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.051 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.051 23:50:43 -- setup/common.sh@33 -- # echo 1536 00:03:54.051 23:50:43 -- setup/common.sh@33 -- # return 0 00:03:54.051 23:50:43 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:54.051 23:50:43 -- setup/hugepages.sh@112 -- # get_nodes 00:03:54.051 23:50:43 -- setup/hugepages.sh@27 -- # local node 00:03:54.051 23:50:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.051 23:50:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:54.051 23:50:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.051 23:50:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:54.051 23:50:43 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:54.051 23:50:43 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:54.051 23:50:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:54.051 23:50:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:54.051 23:50:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:54.051 23:50:43 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:54.051 23:50:43 -- setup/common.sh@18 -- # local node=0 00:03:54.051 23:50:43 -- setup/common.sh@19 -- # local var val 00:03:54.051 23:50:43 -- setup/common.sh@20 -- # local mem_f mem 00:03:54.051 23:50:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.051 23:50:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:54.051 23:50:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:54.051 23:50:43 -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.051 23:50:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.051 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25939116 kB' 'MemUsed: 6652968 kB' 'SwapCached: 0 kB' 'Active: 3587836 kB' 'Inactive: 110720 kB' 'Active(anon): 3323096 kB' 'Inactive(anon): 0 kB' 'Active(file): 264740 kB' 'Inactive(file): 110720 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3460896 kB' 'Mapped: 103344 kB' 'AnonPages: 240904 kB' 'Shmem: 3085436 kB' 'KernelStack: 11304 kB' 'PageTables: 4092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111584 kB' 'Slab: 338068 kB' 'SReclaimable: 111584 kB' 'SUnreclaim: 226484 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- 
setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 
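The scan running above and below this point is setup/common.sh's get_meminfo walking every key of the node-0 meminfo file until it reaches HugePages_Surp. Reconstructed from the trace rather than copied from the real helper (which uses mapfile plus a printf of the whole array, producing the long quoted dumps above), a minimal sketch of the same pattern:

    #!/usr/bin/env bash
    shopt -s extglob    # needed for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1 node=$2 line var val _
        local mem_f=/proc/meminfo
        # With a node argument, prefer the per-node sysfs copy, exactly as
        # the trace switches to /sys/devices/system/node/node0/meminfo.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while read -r line; do
            line=${line#Node +([0-9]) }       # per-node rows carry a "Node N " prefix
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then     # literal, key-by-key comparison
                echo "$val"                   # e.g. 0 for HugePages_Surp here
                return 0
            fi
        done < "$mem_f"
        return 1
    }

    get_meminfo HugePages_Surp 0    # prints 0 in this run

Every non-matching key falls through to continue, which is why each meminfo field appears once per lookup in the log.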
00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- 
setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.052 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.052 23:50:43 -- setup/common.sh@33 -- # echo 0 00:03:54.052 23:50:43 -- setup/common.sh@33 -- # return 0 00:03:54.052 23:50:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:54.052 23:50:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:54.052 23:50:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:54.052 23:50:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:54.052 23:50:43 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:54.052 23:50:43 -- setup/common.sh@18 -- # local node=1 00:03:54.052 23:50:43 -- setup/common.sh@19 -- # local var val 00:03:54.052 23:50:43 -- setup/common.sh@20 -- # local mem_f mem 00:03:54.052 23:50:43 -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.052 23:50:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:54.052 23:50:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:54.052 23:50:43 -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.052 23:50:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.052 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.053 23:50:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703128 kB' 'MemFree: 14759236 kB' 'MemUsed: 12943892 kB' 'SwapCached: 0 kB' 'Active: 6091604 kB' 'Inactive: 3312056 kB' 'Active(anon): 5762236 kB' 'Inactive(anon): 0 kB' 'Active(file): 329368 kB' 'Inactive(file): 3312056 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9103152 kB' 'Mapped: 50804 kB' 'AnonPages: 300552 kB' 'Shmem: 5461728 kB' 'KernelStack: 10296 kB' 'PageTables: 3532 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122428 kB' 'Slab: 359776 kB' 'SReclaimable: 122428 kB' 'SUnreclaim: 237348 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:54.053 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.053 23:50:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.053 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.053 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.053 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.053 23:50:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.053 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.053 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.053 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.053 23:50:43 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.053 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.053 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.053 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.053 23:50:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.053 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.053 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.053 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.053 23:50:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.053 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.311 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.311 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.311 23:50:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.311 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.311 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.311 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.311 23:50:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.311 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.311 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.311 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.311 23:50:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.311 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.311 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 
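Behind the key-matching noise, the check hugepages.sh runs here (steps @99 through @117 in the trace) is plain accounting: the global pool must equal the requested count plus reserved plus surplus pages, and reserved plus per-node surplus are then folded into each node's expected share. A sketch of that flow under the values from this run, reusing the get_meminfo shape noted earlier:

    nr_hugepages=1536
    surp=$(get_meminfo HugePages_Surp)      # -> 0
    resv=$(get_meminfo HugePages_Rsvd)      # -> 0
    total=$(get_meminfo HugePages_Total)    # -> 1536

    # Global consistency: 1536 == 1536 + 0 + 0
    (( total == nr_hugepages + surp + resv )) || echo "hugepage pool mismatch"

    # Per-node pass: start from the split custom_alloc configured and add
    # reserved plus that node's surplus (both 0 in this run).
    nodes_test=([0]=512 [1]=1024)
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
    done

The node-1 lookup surrounding this note is the second of those two per-node HugePages_Surp reads; node 0 came back 0 a few lines up.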
00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- 
setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 
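One reading aid for everything above: the \H\u\g\e\P\a\g\e\s\_\S\u\r\p rendering is not corruption. Under set -x, bash backslash-escapes each character of a quoted right-hand side of [[ == ]] to mark it as a literal string rather than a glob pattern. A short demo reproduces the effect:

    set -x
    get=HugePages_Surp
    [[ HugePages_Free == "$get" ]]   # xtrace prints: [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
    set +x

The escaping is cosmetic; the comparison itself is ordinary string equality.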
00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # continue 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.312 23:50:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.312 23:50:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.312 23:50:43 -- setup/common.sh@33 -- # echo 0 00:03:54.312 23:50:43 -- setup/common.sh@33 -- # return 0 00:03:54.312 23:50:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:54.312 23:50:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:54.312 23:50:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:54.312 23:50:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:54.312 23:50:43 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:54.312 node0=512 expecting 512 00:03:54.312 23:50:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:54.312 23:50:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:54.312 23:50:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:54.312 23:50:43 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:54.312 node1=1024 expecting 1024 00:03:54.312 23:50:43 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:54.312 00:03:54.312 real 0m3.698s 00:03:54.312 user 0m1.406s 00:03:54.312 sys 0m2.349s 00:03:54.312 23:50:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.312 23:50:43 -- common/autotest_common.sh@10 -- # set +x 00:03:54.312 ************************************ 00:03:54.312 END TEST custom_alloc 00:03:54.312 ************************************ 00:03:54.312 23:50:43 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:54.312 23:50:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:54.312 23:50:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:54.312 23:50:43 -- common/autotest_common.sh@10 -- # set +x 00:03:54.312 ************************************ 00:03:54.312 START TEST no_shrink_alloc 00:03:54.312 ************************************ 00:03:54.312 23:50:43 -- common/autotest_common.sh@1104 -- # no_shrink_alloc 00:03:54.312 23:50:43 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:54.312 23:50:43 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:54.312 23:50:43 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:54.312 23:50:43 -- setup/hugepages.sh@51 -- # shift 00:03:54.312 23:50:43 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:54.312 23:50:43 -- setup/hugepages.sh@52 -- # local node_ids 00:03:54.312 23:50:43 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:54.312 23:50:43 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:54.312 23:50:43 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:54.312 23:50:43 -- 
setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:54.312 23:50:43 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:54.312 23:50:43 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:54.312 23:50:43 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:54.312 23:50:43 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:54.312 23:50:43 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:54.312 23:50:43 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:54.312 23:50:43 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:54.312 23:50:43 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:54.312 23:50:43 -- setup/hugepages.sh@73 -- # return 0 00:03:54.312 23:50:43 -- setup/hugepages.sh@198 -- # setup output 00:03:54.312 23:50:43 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:54.312 23:50:43 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:57.597 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:57.597 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:57.597 23:50:47 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:57.597 23:50:47 -- setup/hugepages.sh@89 -- # local node 00:03:57.597 23:50:47 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:57.597 23:50:47 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:57.597 23:50:47 -- setup/hugepages.sh@92 -- # local surp 00:03:57.597 23:50:47 -- setup/hugepages.sh@93 -- # local resv 00:03:57.597 23:50:47 -- setup/hugepages.sh@94 -- # local anon 00:03:57.597 23:50:47 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:57.597 23:50:47 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:57.597 23:50:47 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:57.597 23:50:47 -- setup/common.sh@18 -- # local node= 00:03:57.597 23:50:47 -- setup/common.sh@19 -- # local var val 00:03:57.597 23:50:47 -- setup/common.sh@20 -- # local mem_f mem 00:03:57.597 23:50:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.597 23:50:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.597 23:50:47 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.597 23:50:47 -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.597 23:50:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41782792 kB' 'MemAvailable: 45419760 kB' 'Buffers: 10504 kB' 'Cached: 12553616 kB' 'SwapCached: 0 kB' 'Active: 9682100 kB' 'Inactive: 3422776 kB' 'Active(anon): 9087992 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 543976 kB' 'Mapped: 154188 kB' 'Shmem: 8547236 kB' 'KReclaimable: 234012 kB' 'Slab: 698028 kB' 'SReclaimable: 234012 kB' 'SUnreclaim: 464016 kB' 'KernelStack: 21760 kB' 'PageTables: 7712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10348808 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213416 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.597 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.597 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 
23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.598 23:50:47 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.598 23:50:47 -- setup/common.sh@33 -- # echo 0 00:03:57.598 23:50:47 -- setup/common.sh@33 -- # return 0 00:03:57.598 23:50:47 -- setup/hugepages.sh@97 -- # anon=0 00:03:57.598 23:50:47 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:57.598 23:50:47 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:57.598 23:50:47 -- setup/common.sh@18 -- # local node= 00:03:57.598 23:50:47 -- setup/common.sh@19 -- # local var val 00:03:57.598 23:50:47 -- setup/common.sh@20 -- # local mem_f mem 00:03:57.598 23:50:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.598 23:50:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.598 23:50:47 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.598 23:50:47 -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.598 23:50:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.598 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.599 23:50:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41783840 kB' 'MemAvailable: 45420808 kB' 'Buffers: 10504 kB' 'Cached: 12553620 kB' 'SwapCached: 0 kB' 'Active: 9681756 kB' 'Inactive: 3422776 kB' 'Active(anon): 9087648 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 543628 kB' 'Mapped: 154156 kB' 'Shmem: 8547240 kB' 'KReclaimable: 
234012 kB' 'Slab: 697992 kB' 'SReclaimable: 234012 kB' 'SUnreclaim: 463980 kB' 'KernelStack: 21712 kB' 'PageTables: 7796 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10350068 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213352 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.599 23:50:47 -- setup/common.sh@31 -- # read -r var val 
_ 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.599 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.860 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.860 23:50:47 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- 
# IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
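A note on reading these scans: the right-hand side of every [[ ... == ... ]] test is the lookup key expanded from a quoted variable, which forces a literal (non-glob) comparison, and bash's xtrace re-quotes such a word by escaping every character. The long \H\u\g\e\P\a\g\e\s\_\S\u\r\p runs are therefore ordinary trace output, not corruption. A two-line reproduction (illustrative, not taken from the SPDK scripts):

    set -x
    get=HugePages_Surp
    [[ MemTotal == "$get" ]]   # traced as: [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
    set +x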
00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.861 23:50:47 -- setup/common.sh@33 -- # echo 0 00:03:57.861 23:50:47 -- setup/common.sh@33 -- # return 0 00:03:57.861 23:50:47 -- setup/hugepages.sh@99 -- # surp=0 00:03:57.861 23:50:47 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:57.861 23:50:47 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:57.861 23:50:47 -- setup/common.sh@18 -- # local node= 00:03:57.861 23:50:47 -- setup/common.sh@19 -- # local var val 00:03:57.861 23:50:47 -- setup/common.sh@20 -- # local mem_f mem 00:03:57.861 23:50:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.861 23:50:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.861 23:50:47 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.861 23:50:47 -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.861 23:50:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41783232 kB' 'MemAvailable: 45420200 kB' 'Buffers: 10504 kB' 'Cached: 12553632 kB' 'SwapCached: 0 kB' 'Active: 9682208 kB' 'Inactive: 3422776 kB' 'Active(anon): 9088100 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 544096 kB' 'Mapped: 154148 kB' 'Shmem: 8547252 kB' 'KReclaimable: 234012 kB' 'Slab: 697864 kB' 'SReclaimable: 234012 kB' 'SUnreclaim: 463852 kB' 'KernelStack: 21744 kB' 'PageTables: 8164 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10350320 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213416 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.861 23:50:47 
-- setup/common.sh@31 -- # read -r var val _ 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.861 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.861 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 
-- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.862 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.862 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:57.863 23:50:47 -- setup/common.sh@33 -- # echo 0 00:03:57.863 23:50:47 -- setup/common.sh@33 -- # return 0 00:03:57.863 23:50:47 -- setup/hugepages.sh@100 -- # resv=0 00:03:57.863 23:50:47 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:57.863 nr_hugepages=1024 
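At this point the three counters feeding the verification have all been read the same way and all returned 0: anon (AnonHugePages), surp (HugePages_Surp) and resv (HugePages_Rsvd). Each read is the loop traced above: split one meminfo line on IFS=': ', continue past every key that is not the target, and echo the value of the one that matches. A minimal sketch of that scan, assuming it mirrors setup/common.sh's get_meminfo (the function body and the sed-based prefix strip here are illustrative, not the SPDK source):

    get_meminfo() {                    # sketch only; assumed shape
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        local var val _

        # With a node argument, prefer the per-node counter file, as the
        # trace's [[ -e /sys/devices/system/node/node$node/meminfo ]] does.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        # Per-node files prefix each line with "Node <N> "; strip it so the
        # key always lands in var (the real script strips it with the extglob
        # expansion "${mem[@]#Node +([0-9]) }" instead).
        while IFS=': ' read -r var val _; do
            # every non-matching key appears as one "continue" in the trace
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
        return 1
    }

Called as get_meminfo HugePages_Rsvd it reads /proc/meminfo system-wide; called as get_meminfo HugePages_Surp 0 it reads node 0's counters, as the per-node walk further down does.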
00:03:57.863 23:50:47 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:57.863 resv_hugepages=0 00:03:57.863 23:50:47 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:57.863 surplus_hugepages=0 00:03:57.863 23:50:47 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:57.863 anon_hugepages=0 00:03:57.863 23:50:47 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:57.863 23:50:47 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:57.863 23:50:47 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:57.863 23:50:47 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:57.863 23:50:47 -- setup/common.sh@18 -- # local node= 00:03:57.863 23:50:47 -- setup/common.sh@19 -- # local var val 00:03:57.863 23:50:47 -- setup/common.sh@20 -- # local mem_f mem 00:03:57.863 23:50:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.863 23:50:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.863 23:50:47 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.863 23:50:47 -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.863 23:50:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41785824 kB' 'MemAvailable: 45422792 kB' 'Buffers: 10504 kB' 'Cached: 12553644 kB' 'SwapCached: 0 kB' 'Active: 9682144 kB' 'Inactive: 3422776 kB' 'Active(anon): 9088036 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 544140 kB' 'Mapped: 154156 kB' 'Shmem: 8547264 kB' 'KReclaimable: 234012 kB' 'Slab: 697864 kB' 'SReclaimable: 234012 kB' 'SUnreclaim: 463852 kB' 'KernelStack: 21664 kB' 'PageTables: 7692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10345808 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213320 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- 
setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.863 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.863 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.864 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.864 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.864 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.864 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.864 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.864 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.864 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.864 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # continue 00:03:57.864 23:50:47 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.864 23:50:47 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.864 23:50:47 -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:57.864 23:50:47 -- setup/common.sh@32 -- # continue
00:03:57.864 23:50:47 -- setup/common.sh@32 -- # [[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue (same skip repeated for SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, no match)
00:03:57.864 23:50:47 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:57.864 23:50:47 -- setup/common.sh@33 -- # echo 1024
00:03:57.864 23:50:47 -- setup/common.sh@33 -- # return 0
00:03:57.864 23:50:47 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:57.864 23:50:47 -- setup/hugepages.sh@112 -- # get_nodes
00:03:57.864 23:50:47 -- setup/hugepages.sh@27 -- # local node
00:03:57.864 23:50:47 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:57.864 23:50:47 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:57.864 23:50:47 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:57.864 23:50:47 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:57.864 23:50:47 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:57.864 23:50:47 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:57.864 23:50:47 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:57.864 23:50:47 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:57.864 23:50:47 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:57.864 23:50:47 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:57.864 23:50:47 -- setup/common.sh@18 -- # local node=0
00:03:57.864 23:50:47 -- setup/common.sh@19 -- # local var val
00:03:57.864 23:50:47 -- setup/common.sh@20 -- # local mem_f mem
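The trace above is the inner loop of the script's get_meminfo helper: it snapshots /proc/meminfo (or a node's own meminfo file) and walks it line by line until the requested key matches, which is why each remaining field shows up once against the escaped pattern before the hit. A minimal stand-alone sketch of that pattern, assuming only standard kernel paths (a reconstruction for readability, not the setup/common.sh source):

#!/usr/bin/env bash
# Minimal sketch of the get_meminfo pattern traced above; hypothetical
# rewrite for readability, not the setup/common.sh source itself.
shopt -s extglob   # for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    local -a mem
    local line var val _
    # Per-node queries read the node's own meminfo file instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Node meminfo lines carry a "Node <n> " prefix; strip it so keys
    # match their /proc/meminfo names (no-op for /proc/meminfo itself).
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue   # the escaped literal match in the trace
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo HugePages_Total     # prints 1024 in the run above
get_meminfo HugePages_Surp 0    # prints 0 for node0 in the run above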
00:03:57.864 23:50:47 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:57.864 23:50:47 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:57.864 23:50:47 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:57.864 23:50:47 -- setup/common.sh@28 -- # mapfile -t mem
00:03:57.864 23:50:47 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:57.864 23:50:47 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 24930528 kB' 'MemUsed: 7661556 kB' 'SwapCached: 0 kB' 'Active: 3589116 kB' 'Inactive: 110720 kB' 'Active(anon): 3324376 kB' 'Inactive(anon): 0 kB' 'Active(file): 264740 kB' 'Inactive(file): 110720 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3460976 kB' 'Mapped: 103352 kB' 'AnonPages: 242072 kB' 'Shmem: 3085516 kB' 'KernelStack: 11272 kB' 'PageTables: 3884 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111584 kB' 'Slab: 338264 kB' 'SReclaimable: 111584 kB' 'SUnreclaim: 226680 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:57.864 23:50:47 -- setup/common.sh@31 -- # IFS=': '
00:03:57.864 23:50:47 -- setup/common.sh@31 -- # read -r var val _
00:03:57.864 23:50:47 -- setup/common.sh@32 -- # [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue (same skip repeated for every node0 meminfo field above, MemTotal through HugePages_Free, no match)
00:03:57.865 23:50:47 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:57.865 23:50:47 -- setup/common.sh@33 -- # echo 0
00:03:57.865 23:50:47 -- setup/common.sh@33 -- # return 0
00:03:57.865 23:50:47 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:57.865 23:50:47 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:57.865 23:50:47 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:57.865 23:50:47 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:57.865 23:50:47 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:57.865 node0=1024 expecting 1024
00:03:57.865 23:50:47 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:57.865 23:50:47 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:03:57.865 23:50:47 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:03:57.865 23:50:47 -- setup/hugepages.sh@202 -- # setup output
00:03:57.866 23:50:47 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:57.866 23:50:47 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:01.154 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:01.154 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:01.154 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:01.416 23:50:50 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:01.416 23:50:50 -- setup/hugepages.sh@89 -- # local node
00:04:01.416 23:50:50 -- setup/hugepages.sh@90 -- # local sorted_t
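The INFO line from setup.sh records that the requested pool (NRHUGE=512) was already exceeded by the 1024 pages allocated earlier, so the pool was left alone. A rough sketch of that decision, assuming the standard per-node sysfs knob for 2048 kB pages (a simplified reconstruction, not the setup.sh source):

# Rough reconstruction of the decision behind the INFO line above, assuming
# the standard per-node sysfs knob for 2048 kB pages; not the setup.sh source.
NRHUGE=512
node0=/sys/devices/system/node/node0/hugepages/hugepages-2048kB
allocated=$(<"$node0/nr_hugepages")
if (( allocated < NRHUGE )); then
    echo "$NRHUGE" > "$node0/nr_hugepages"   # needs root
else
    echo "INFO: Requested $NRHUGE hugepages but $allocated already allocated on node0"
fi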
00:04:01.416 23:50:50 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:01.416 23:50:50 -- setup/hugepages.sh@92 -- # local surp
00:04:01.416 23:50:50 -- setup/hugepages.sh@93 -- # local resv
00:04:01.416 23:50:50 -- setup/hugepages.sh@94 -- # local anon
00:04:01.416 23:50:50 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:01.416 23:50:50 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:01.416 23:50:50 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:01.416 23:50:50 -- setup/common.sh@18 -- # local node=
00:04:01.416 23:50:50 -- setup/common.sh@19 -- # local var val
00:04:01.416 23:50:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:01.416 23:50:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:01.416 23:50:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:01.416 23:50:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:01.416 23:50:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:01.416 23:50:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:01.416 23:50:50 -- setup/common.sh@31 -- # IFS=': '
00:04:01.416 23:50:50 -- setup/common.sh@31 -- # read -r var val _
00:04:01.416 23:50:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41792024 kB' 'MemAvailable: 45428968 kB' 'Buffers: 10504 kB' 'Cached: 12553736 kB' 'SwapCached: 0 kB' 'Active: 9680412 kB' 'Inactive: 3422776 kB' 'Active(anon): 9086304 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 542324 kB' 'Mapped: 154280 kB' 'Shmem: 8547356 kB' 'KReclaimable: 233964 kB' 'Slab: 697820 kB' 'SReclaimable: 233964 kB' 'SUnreclaim: 463856 kB' 'KernelStack: 21600 kB' 'PageTables: 7480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10346544 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213256 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:04:01.416 23:50:50 -- setup/common.sh@32 -- # [[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] / continue (same skip repeated for every /proc/meminfo field above, MemTotal through HardwareCorrupted, no match)
00:04:01.417 23:50:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:01.417 23:50:50 -- setup/common.sh@33 -- # echo 0
00:04:01.417 23:50:50 -- setup/common.sh@33 -- # return 0
00:04:01.417 23:50:50 -- setup/hugepages.sh@97 -- # anon=0
00:04:01.417 23:50:50 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:01.417 23:50:50 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:01.417 23:50:50 -- setup/common.sh@18 -- # local node=
00:04:01.417 23:50:50 -- setup/common.sh@19 -- # local var val
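The hugepages.sh@96 test above looks odd in xtrace form because bash prints the literal glob *[never]* with every character escaped. It is simply inspecting the transparent hugepage mode and only counting AnonHugePages when THP is not disabled outright; here the mode reads 'always [madvise] never', so the test passes and anon comes back 0. A sketch of the same gate, assuming the standard THP sysfs knob (get_meminfo is the helper sketched earlier):

# Sketch of the THP gate traced above, assuming the standard kernel knob.
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # "always [madvise] never" here
anon=0
if [[ $thp != *"[never]"* ]]; then   # xtrace renders this pattern as *\[\n\e\v\e\r\]*
    anon=$(get_meminfo AnonHugePages)   # helper from the earlier sketch
fi
echo "anon=$anon"   # 0 in this run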
00:04:01.417 23:50:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:01.417 23:50:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:01.417 23:50:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:01.417 23:50:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:01.417 23:50:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:01.417 23:50:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:01.418 23:50:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41792512 kB' 'MemAvailable: 45429440 kB' 'Buffers: 10504 kB' 'Cached: 12553740 kB' 'SwapCached: 0 kB' 'Active: 9680036 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085928 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541884 kB' 'Mapped: 154160 kB' 'Shmem: 8547360 kB' 'KReclaimable: 233932 kB' 'Slab: 697772 kB' 'SReclaimable: 233932 kB' 'SUnreclaim: 463840 kB' 'KernelStack: 21600 kB' 'PageTables: 7464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10346556 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213224 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:04:01.418 23:50:50 -- setup/common.sh@31 -- # IFS=': '
00:04:01.418 23:50:50 -- setup/common.sh@31 -- # read -r var val _
00:04:01.418 23:50:50 -- setup/common.sh@32 -- # [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue (same skip repeated for every field above, MemTotal through HugePages_Free, no match)
00:04:01.419 23:50:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:01.419 23:50:50 -- setup/common.sh@33 -- # echo 0
00:04:01.419 23:50:50 -- setup/common.sh@33 -- # return 0
00:04:01.419 23:50:50 -- setup/hugepages.sh@99 -- # surp=0
00:04:01.419 23:50:50 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:01.419 23:50:50 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:01.419 23:50:50 -- setup/common.sh@18 -- # local node=
00:04:01.419 23:50:50 -- setup/common.sh@19 -- # local var val
00:04:01.419 23:50:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:01.419 23:50:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:01.419 23:50:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:01.419 23:50:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:01.419 23:50:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:01.419 23:50:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:01.419 23:50:50 -- setup/common.sh@31 -- # IFS=': '
00:04:01.419 23:50:50 -- setup/common.sh@31 -- # read -r var val _
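The surrounding get_meminfo calls pull HugePages_Surp and HugePages_Rsvd out of identical snapshots one key at a time. The kernel also exposes these counters as individual per-size sysfs files, so the accounting check that verify_nr_hugepages is building up to can be sketched directly against them (a hedged sketch using the standard paths for the default 2048 kB page size; the identity mirrors the hugepages.sh@107 test traced below):

# Hedged sketch: the same accounting verify_nr_hugepages builds up to,
# read from the per-size sysfs counters instead of /proc/meminfo.
hp=/sys/kernel/mm/hugepages/hugepages-2048kB
total=$(<"$hp/nr_hugepages")       # HugePages_Total, 1024 in this run
resv=$(<"$hp/resv_hugepages")      # HugePages_Rsvd, 0 in this run
surp=$(<"$hp/surplus_hugepages")   # HugePages_Surp, 0 in this run
nr_hugepages=1024                  # the pool size the test configured

# Mirrors hugepages.sh@107: (( 1024 == nr_hugepages + surp + resv ))
if (( total == nr_hugepages + surp + resv )); then
    echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp"
else
    echo "unexpected hugepage accounting: total=$total" >&2
fi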
00:04:01.419 23:50:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41792512 kB' 'MemAvailable: 45429440 kB' 'Buffers: 10504 kB' 'Cached: 12553752 kB' 'SwapCached: 0 kB' 'Active: 9679648 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085540 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541436 kB' 'Mapped: 154160 kB' 'Shmem: 8547372 kB' 'KReclaimable: 233932 kB' 'Slab: 697772 kB' 'SReclaimable: 233932 kB' 'SUnreclaim: 463840 kB' 'KernelStack: 21584 kB' 'PageTables: 7412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10346572 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213240 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB'
00:04:01.419 23:50:50 -- setup/common.sh@32 -- # [[ <field> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] / continue (same skip repeated for every field above, MemTotal through HugePages_Free, no match)
00:04:01.420 23:50:50 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:01.420 23:50:50 -- setup/common.sh@33 -- # echo 0
00:04:01.420 23:50:50 -- setup/common.sh@33 -- # return 0
00:04:01.420 23:50:50 -- setup/hugepages.sh@100 -- # resv=0
00:04:01.420 23:50:50 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:01.420 nr_hugepages=1024
00:04:01.420 23:50:50 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:01.420 resv_hugepages=0
00:04:01.420 23:50:50 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:01.420 surplus_hugepages=0
00:04:01.420 23:50:50 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:01.420 anon_hugepages=0
00:04:01.420 23:50:50 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:01.420 23:50:50 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:01.420 23:50:50 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:01.420 23:50:50 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:01.420 23:50:50 -- setup/common.sh@18 -- # local node=
00:04:01.420 23:50:50 -- setup/common.sh@19 -- # local var val
00:04:01.420 23:50:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:01.420 23:50:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:01.420 23:50:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:01.420 23:50:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:01.420 23:50:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:01.420 23:50:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:01.420 23:50:50 -- setup/common.sh@31 -- # IFS=': '
00:04:01.420 23:50:50 -- setup/common.sh@31 -- # read -r var val _
00:04:01.421 23:50:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295212 kB' 'MemFree: 41791808 kB' 'MemAvailable: 45428736 kB' 'Buffers: 10504 kB' 'Cached: 12553764 kB' 'SwapCached: 0 kB' 'Active: 9680068 kB' 'Inactive: 3422776 kB' 'Active(anon): 9085960 kB' 'Inactive(anon): 0 kB' 'Active(file): 594108 kB' 'Inactive(file): 3422776 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 541884 kB' 'Mapped: 154160 kB' 'Shmem: 8547384 kB' 'KReclaimable: 233932 kB' 'Slab: 697772 kB' 'SReclaimable: 233932 kB' 'SUnreclaim: 463840 kB' 'KernelStack: 21600 kB' 'PageTables: 7464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487632 kB' 'Committed_AS: 10346584 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213240 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB'
'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 492916 kB' 'DirectMap2M: 9678848 kB' 'DirectMap1G: 58720256 kB' 00:04:01.421
[xtrace condensed: get_meminfo scans each key from MemTotal through Unaccepted against HugePages_Total; none matches]
00:04:01.422 23:50:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:01.422 23:50:50 -- setup/common.sh@33 -- # echo 1024 00:04:01.422 23:50:50 -- setup/common.sh@33 -- # return 0 00:04:01.422 23:50:50 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
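The compare/continue runs above are bash xtrace of the get_meminfo helper in setup/common.sh: it loads a meminfo file and walks it one 'Key: value' line at a time until the requested key matches, then echoes the value and returns 0. A minimal sketch of that logic, reconstructed from the trace (simplified: the real helper mapfiles the file and strips per-node 'Node N' prefixes, as the @28/@29 entries show):

    # get_meminfo KEY [NODE] -- print KEY's value, e.g. HugePages_Total -> 1024
    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # Per-node queries read the node-local file instead (common.sh@23-24).
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < "$mem_f"
        return 1
    }

With the values in hand, the hugepages.sh@107-110 checks are plain accounting: the kernel's HugePages_Total (1024) must equal the requested nr_hugepages plus surplus and reserved pages, and 1024 == 1024 + 0 + 0 holds here.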
00:04:01.422 23:50:50 -- setup/hugepages.sh@112 -- # get_nodes 00:04:01.422 23:50:50 -- setup/hugepages.sh@27 -- # local node 00:04:01.422 23:50:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:01.422 23:50:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:01.422 23:50:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:01.422 23:50:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:01.422 23:50:50 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:01.422 23:50:50 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:01.422 23:50:50 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:01.422 23:50:50 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:01.422 23:50:50 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:01.422 23:50:50 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:01.422 23:50:50 -- setup/common.sh@18 -- # local node=0 00:04:01.422 23:50:50 -- setup/common.sh@19 -- # local var val 00:04:01.422 23:50:50 -- setup/common.sh@20 -- # local mem_f mem 00:04:01.422 23:50:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:01.422 23:50:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:01.422 23:50:50 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:01.422 23:50:50 -- setup/common.sh@28 -- # mapfile -t mem 00:04:01.422 23:50:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:01.422 23:50:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 24904664 kB' 'MemUsed: 7687420 kB' 'SwapCached: 0 kB' 'Active: 3588484 kB' 'Inactive: 110720 kB' 'Active(anon): 3323744 kB' 'Inactive(anon): 0 kB' 'Active(file): 264740 kB' 'Inactive(file): 110720 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3461040 kB' 'Mapped: 103356 kB' 'AnonPages: 241416 kB' 'Shmem: 3085580 kB' 'KernelStack: 11304 kB' 'PageTables: 4040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111568 kB' 'Slab: 338100 kB' 'SReclaimable: 111568 kB' 'SUnreclaim: 226532 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace condensed: the per-key scan of node0's meminfo repeats for MemTotal through HugePages_Free; none matches HugePages_Surp]
00:04:01.423 23:50:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.423 23:50:50 -- setup/common.sh@33 -- # echo 0 00:04:01.423 23:50:50 -- setup/common.sh@33 -- # return 0 00:04:01.423 23:50:50 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:01.423 23:50:50 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:01.423 23:50:50 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:01.423 23:50:50 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:01.423 23:50:50 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:01.423 node0=1024 expecting 1024 00:04:01.423 23:50:50 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:01.423 00:04:01.423 real 0m7.227s 00:04:01.423 user 0m2.734s 00:04:01.423 sys 0m4.622s 00:04:01.423 23:50:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:01.423 23:50:50 -- common/autotest_common.sh@10 -- # set +x 00:04:01.423 ************************************ 00:04:01.423 END TEST no_shrink_alloc 00:04:01.423 ************************************ 00:04:01.423 23:50:50 -- setup/hugepages.sh@217 -- # clear_hp 00:04:01.423 23:50:50 -- setup/hugepages.sh@37 -- # local node hp 00:04:01.423 23:50:50 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:01.423 23:50:50 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:01.423 23:50:50 -- setup/hugepages.sh@41 -- # echo 0
23:50:50 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:01.423 23:50:50 -- setup/hugepages.sh@41 -- # echo 0 00:04:01.423 23:50:50 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:01.423 23:50:50 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:01.423 23:50:50 -- setup/hugepages.sh@41 -- # echo 0 00:04:01.423 23:50:50 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:01.423 23:50:50 -- setup/hugepages.sh@41 -- # echo 0 00:04:01.423 23:50:50 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:01.423 23:50:50 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:01.423 00:04:01.423 real 0m27.552s 00:04:01.423 user 0m9.831s 00:04:01.423 sys 0m16.655s 00:04:01.423 23:50:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:01.423 23:50:50 -- common/autotest_common.sh@10 -- # set +x 00:04:01.423 ************************************ 00:04:01.423 END TEST hugepages 00:04:01.423 ************************************ 00:04:01.682 23:50:51 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:01.682 23:50:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:01.682 23:50:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:01.682 23:50:51 -- common/autotest_common.sh@10 -- # set +x 00:04:01.682 ************************************ 00:04:01.682 START TEST driver 00:04:01.682 ************************************ 00:04:01.682 23:50:51 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:01.682 * Looking for test storage... 
00:04:01.682 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:01.682 23:50:51 -- setup/driver.sh@68 -- # setup reset 00:04:01.682 23:50:51 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:01.682 23:50:51 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:06.951 23:50:55 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:06.951 23:50:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:06.951 23:50:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:06.951 23:50:55 -- common/autotest_common.sh@10 -- # set +x 00:04:06.951 ************************************ 00:04:06.951 START TEST guess_driver 00:04:06.951 ************************************ 00:04:06.951 23:50:55 -- common/autotest_common.sh@1104 -- # guess_driver 00:04:06.951 23:50:55 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:06.951 23:50:55 -- setup/driver.sh@47 -- # local fail=0 00:04:06.951 23:50:55 -- setup/driver.sh@49 -- # pick_driver 00:04:06.951 23:50:55 -- setup/driver.sh@36 -- # vfio 00:04:06.951 23:50:55 -- setup/driver.sh@21 -- # local iommu_grups 00:04:06.951 23:50:55 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:06.951 23:50:55 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:06.951 23:50:55 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:06.951 23:50:55 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:06.951 23:50:55 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:06.951 23:50:55 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:06.951 23:50:55 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:06.951 23:50:55 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:06.951 23:50:55 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:06.951 23:50:55 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:06.951 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:06.951 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:06.951 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:06.951 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:06.951 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:06.951 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:06.951 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:06.951 23:50:55 -- setup/driver.sh@30 -- # return 0 00:04:06.951 23:50:55 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:06.951 23:50:55 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:06.951 23:50:55 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:06.951 23:50:55 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:06.951 Looking for driver=vfio-pci 00:04:06.951 23:50:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:06.951 23:50:55 -- setup/driver.sh@45 -- # setup output config 00:04:06.951 23:50:55 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:06.951 23:50:55 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:10.238 23:50:59 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:10.238 23:50:59 -- setup/driver.sh@61 -- # [[ 
vfio-pci == vfio-pci ]] 00:04:10.238 23:50:59 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
[xtrace condensed: the @58 marker test and @61 vfio-pci check repeat identically for each remaining device line of the setup.sh config output, every device reporting vfio-pci]
00:04:11.614 23:51:00 --
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:11.614 23:51:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:11.614 23:51:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:11.614 23:51:00 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:11.614 23:51:00 -- setup/driver.sh@65 -- # setup reset 00:04:11.614 23:51:00 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:11.614 23:51:00 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:16.882 00:04:16.883 real 0m10.034s 00:04:16.883 user 0m2.690s 00:04:16.883 sys 0m5.094s 00:04:16.883 23:51:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:16.883 23:51:05 -- common/autotest_common.sh@10 -- # set +x 00:04:16.883 ************************************ 00:04:16.883 END TEST guess_driver 00:04:16.883 ************************************ 00:04:16.883 00:04:16.883 real 0m14.724s 00:04:16.883 user 0m3.939s 00:04:16.883 sys 0m7.652s 00:04:16.883 23:51:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:16.883 23:51:05 -- common/autotest_common.sh@10 -- # set +x 00:04:16.883 ************************************ 00:04:16.883 END TEST driver 00:04:16.883 ************************************ 00:04:16.883 23:51:05 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:16.883 23:51:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:16.883 23:51:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:16.883 23:51:05 -- common/autotest_common.sh@10 -- # set +x 00:04:16.883 ************************************ 00:04:16.883 START TEST devices 00:04:16.883 ************************************ 00:04:16.883 23:51:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:16.883 * Looking for test storage... 
00:04:16.883 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:16.883 23:51:05 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:16.883 23:51:05 -- setup/devices.sh@192 -- # setup reset 00:04:16.883 23:51:05 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:16.883 23:51:05 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:20.171 23:51:09 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:20.171 23:51:09 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:04:20.171 23:51:09 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:04:20.171 23:51:09 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:04:20.171 23:51:09 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:04:20.171 23:51:09 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:04:20.171 23:51:09 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:04:20.171 23:51:09 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:20.171 23:51:09 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:04:20.171 23:51:09 -- setup/devices.sh@196 -- # blocks=() 00:04:20.171 23:51:09 -- setup/devices.sh@196 -- # declare -a blocks 00:04:20.171 23:51:09 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:20.172 23:51:09 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:20.172 23:51:09 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:20.172 23:51:09 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:20.172 23:51:09 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:20.172 23:51:09 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:20.172 23:51:09 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:20.172 23:51:09 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:20.172 23:51:09 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:20.172 23:51:09 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:20.172 23:51:09 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:20.172 No valid GPT data, bailing 00:04:20.172 23:51:09 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:20.172 23:51:09 -- scripts/common.sh@393 -- # pt= 00:04:20.172 23:51:09 -- scripts/common.sh@394 -- # return 1 00:04:20.172 23:51:09 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:20.172 23:51:09 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:20.172 23:51:09 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:20.172 23:51:09 -- setup/common.sh@80 -- # echo 1600321314816 00:04:20.172 23:51:09 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:20.172 23:51:09 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:20.172 23:51:09 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:20.172 23:51:09 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:20.172 23:51:09 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:20.172 23:51:09 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:20.172 23:51:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:20.172 23:51:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:20.172 23:51:09 -- common/autotest_common.sh@10 -- # set +x 00:04:20.172 ************************************ 00:04:20.172 START TEST nvme_mount 00:04:20.172 ************************************ 00:04:20.172 
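Before nvme_mount starts, devices.sh@194-211 picked the disk under test: get_zoned_devs skips zoned namespaces, scripts/spdk-gpt.py reports whether the disk already carries a partition table ('No valid GPT data, bailing' means it is free), and only namespaces of at least min_disk_size=3221225472 bytes (3 GiB) are kept; nvme0n1 reports 1600321314816 bytes and qualifies. A rough sketch of the size filter (a reconstruction under those assumptions, not the verbatim script):

    min_disk_size=$((3 * 1024 * 1024 * 1024))   # 3221225472 bytes

    # /sys/block/<dev>/size counts 512-byte sectors, hence the * 512.
    sec_size_to_bytes() {
        echo $(( $(< "/sys/block/$1/size") * 512 ))
    }

    blocks=()
    for block in /sys/block/nvme*; do
        dev=${block##*/}
        # Skip zoned namespaces, mirroring the queue/zoned == none check in the trace.
        [[ -e $block/queue/zoned && $(< "$block/queue/zoned") != none ]] && continue
        (( $(sec_size_to_bytes "$dev") >= min_disk_size )) && blocks+=("$dev")
    done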
23:51:09 -- common/autotest_common.sh@1104 -- # nvme_mount 00:04:20.172 23:51:09 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:20.172 23:51:09 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:20.172 23:51:09 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:20.172 23:51:09 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:20.172 23:51:09 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:20.172 23:51:09 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:20.172 23:51:09 -- setup/common.sh@40 -- # local part_no=1 00:04:20.172 23:51:09 -- setup/common.sh@41 -- # local size=1073741824 00:04:20.172 23:51:09 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:20.172 23:51:09 -- setup/common.sh@44 -- # parts=() 00:04:20.172 23:51:09 -- setup/common.sh@44 -- # local parts 00:04:20.172 23:51:09 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:20.172 23:51:09 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:20.172 23:51:09 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:20.172 23:51:09 -- setup/common.sh@46 -- # (( part++ )) 00:04:20.172 23:51:09 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:20.172 23:51:09 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:20.172 23:51:09 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:20.172 23:51:09 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:21.108 Creating new GPT entries in memory. 00:04:21.108 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:21.108 other utilities. 00:04:21.108 23:51:10 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:21.108 23:51:10 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:21.108 23:51:10 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:21.108 23:51:10 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:21.108 23:51:10 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:22.486 Creating new GPT entries in memory. 00:04:22.486 The operation has completed successfully. 
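The sgdisk pair just traced is driven by simple sector arithmetic in setup/common.sh: the 1 GiB partition size is divided by 512 (common.sh@51) to get 2097152 sectors, the first partition starts at sector 2048, and its last sector is 2048 + 2097152 - 1 = 2099199, exactly the --new=1:2048:2099199 argument above; flock serializes access to the device node, and sync_dev_uevents.sh waits for the partition's uevent before the test continues. As a sketch with the traced values:

    disk=nvme0n1
    size=$((1073741824 / 512))            # 1 GiB in 512-byte sectors = 2097152
    part_start=2048                       # first usable aligned sector
    part_end=$((part_start + size - 1))   # 2048 + 2097152 - 1 = 2099199
    sgdisk "/dev/$disk" --zap-all         # wipe any existing GPT/MBR structures
    flock "/dev/$disk" sgdisk "/dev/$disk" --new=1:"$part_start":"$part_end"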
00:04:22.486 23:51:11 -- setup/common.sh@57 -- # (( part++ )) 00:04:22.486 23:51:11 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:22.486 23:51:11 -- setup/common.sh@62 -- # wait 432205 00:04:22.486 23:51:11 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.486 23:51:11 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:22.486 23:51:11 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.486 23:51:11 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:22.486 23:51:11 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:22.486 23:51:11 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.486 23:51:11 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:22.486 23:51:11 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:22.486 23:51:11 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:22.486 23:51:11 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.486 23:51:11 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:22.486 23:51:11 -- setup/devices.sh@53 -- # local found=0 00:04:22.486 23:51:11 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:22.486 23:51:11 -- setup/devices.sh@56 -- # : 00:04:22.486 23:51:11 -- setup/devices.sh@59 -- # local pci status 00:04:22.486 23:51:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.486 23:51:11 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:22.486 23:51:11 -- setup/devices.sh@47 -- # setup output config 00:04:22.486 23:51:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:22.486 23:51:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:25.772 23:51:14 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:25.772 23:51:14 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:25.772 23:51:14 -- setup/devices.sh@63 -- # found=1 00:04:25.772 23:51:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.772 23:51:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:25.772 23:51:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.772 23:51:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:25.772 23:51:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.772 23:51:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:25.772 23:51:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.772 23:51:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:25.772 23:51:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.772 23:51:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 
]] 23:51:14 -- setup/devices.sh@60 -- # read -r pci _ _ status
[xtrace condensed: the allowlist comparison repeats for 0000:00:04.5 through 0000:80:04.7; none matches PCI_ALLOWED=0000:d8:00.0]
23:51:15 -- setup/devices.sh@66 -- # (( found == 1 )) 23:51:15 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 23:51:15 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 23:51:15 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 23:51:15 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 23:51:15 -- setup/devices.sh@110 -- # cleanup_nvme 23:51:15 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 23:51:15 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 23:51:15 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 23:51:15 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 23:51:15 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 23:51:15 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:26.031 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:26.031 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:26.031 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
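The wipefs output above is worth decoding: 53 ef at offset 0x438 is the ext4 superblock magic; the two 8-byte 45 46 49 20 50 41 52 54 ('EFI PART') erases are the primary GPT header at LBA 1 (offset 0x200) and the backup header in the disk's last sector (0x1749a955e00 = 1600321314816 - 512); and 55 aa at 0x1fe is the protective-MBR boot signature. The cleanup_nvme teardown traced at devices.sh@20-28 amounts to this sketch (a reconstruction, not the verbatim script):

    cleanup_nvme() {
        # Unmount the scratch mount if it is still active.
        mountpoint -q "$nvme_mount" && umount "$nvme_mount"
        # Erase filesystem and partition-table signatures so the next
        # test starts from a blank disk.
        [[ -b /dev/nvme0n1p1 ]] && wipefs --all /dev/nvme0n1p1
        [[ -b /dev/nvme0n1 ]] && wipefs --all /dev/nvme0n1
    }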
(PMBR): 55 aa 00:04:26.031 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:26.031 23:51:15 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:26.031 23:51:15 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:26.031 23:51:15 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:26.031 23:51:15 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:26.031 23:51:15 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:26.031 23:51:15 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:26.031 23:51:15 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:26.031 23:51:15 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:26.031 23:51:15 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:26.031 23:51:15 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:26.031 23:51:15 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:26.031 23:51:15 -- setup/devices.sh@53 -- # local found=0 00:04:26.031 23:51:15 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:26.031 23:51:15 -- setup/devices.sh@56 -- # : 00:04:26.031 23:51:15 -- setup/devices.sh@59 -- # local pci status 00:04:26.031 23:51:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.031 23:51:15 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:26.031 23:51:15 -- setup/devices.sh@47 -- # setup output config 00:04:26.031 23:51:15 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:26.031 23:51:15 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:29.317 23:51:18 -- setup/devices.sh@63 -- # found=1 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:29.317 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.317 23:51:18 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:29.317 23:51:18 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:29.318 23:51:18 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:29.318 23:51:18 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:29.318 23:51:18 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:29.318 23:51:18 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:29.318 23:51:18 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:29.318 23:51:18 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:29.318 23:51:18 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:29.318 23:51:18 -- setup/devices.sh@50 -- # local mount_point= 00:04:29.318 23:51:18 -- setup/devices.sh@51 -- # local test_file= 00:04:29.318 23:51:18 -- setup/devices.sh@53 -- # local found=0 00:04:29.318 23:51:18 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:29.318 23:51:18 -- setup/devices.sh@59 -- # local pci status 00:04:29.318 23:51:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.318 23:51:18 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:29.318 23:51:18 -- setup/devices.sh@47 -- # setup output config 00:04:29.318 23:51:18 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:29.318 23:51:18 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:32.760 23:51:22 -- setup/devices.sh@63 -- # found=1 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:32.760 23:51:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:32.760 23:51:22 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:32.760 23:51:22 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:32.760 23:51:22 -- setup/devices.sh@68 -- # return 0 00:04:32.760 23:51:22 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:32.760 23:51:22 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:32.760 23:51:22 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:32.760 23:51:22 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:04:32.760 23:51:22 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:04:32.760 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:04:32.760
00:04:32.760 real 0m12.631s
00:04:32.760 user 0m3.776s
00:04:32.760 sys 0m6.754s
00:04:32.760 23:51:22 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:32.760 23:51:22 -- common/autotest_common.sh@10 -- # set +x
00:04:32.760 ************************************
00:04:32.760 END TEST nvme_mount
00:04:32.760 ************************************
00:04:32.760 23:51:22 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount
00:04:32.760 23:51:22 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:32.760 23:51:22 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:32.760 23:51:22 -- common/autotest_common.sh@10 -- # set +x
00:04:32.760 ************************************
00:04:32.760 START TEST dm_mount
00:04:32.761 ************************************
00:04:32.761 23:51:22 -- common/autotest_common.sh@1104 -- # dm_mount
00:04:32.761 23:51:22 -- setup/devices.sh@144 -- # pv=nvme0n1
00:04:32.761 23:51:22 -- setup/devices.sh@145 -- # pv0=nvme0n1p1
00:04:32.761 23:51:22 -- setup/devices.sh@146 -- # pv1=nvme0n1p2
00:04:32.761 23:51:22 -- setup/devices.sh@148 -- # partition_drive nvme0n1
00:04:32.761 23:51:22 -- setup/common.sh@39 -- # local disk=nvme0n1
00:04:32.761 23:51:22 -- setup/common.sh@40 -- # local part_no=2
00:04:32.761 23:51:22 -- setup/common.sh@41 -- # local size=1073741824
00:04:32.761 23:51:22 -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:04:32.761 23:51:22 -- setup/common.sh@44 -- # parts=()
00:04:32.761 23:51:22 -- setup/common.sh@44 -- # local parts
00:04:32.761 23:51:22 -- setup/common.sh@46 -- # (( part = 1 ))
00:04:32.761 23:51:22 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:32.761 23:51:22 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:04:32.761 23:51:22 -- setup/common.sh@46 -- # (( part++ ))
00:04:32.761 23:51:22 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:32.761 23:51:22 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:04:32.761 23:51:22 -- setup/common.sh@46 -- # (( part++ ))
00:04:32.761 23:51:22 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:32.761 23:51:22 -- setup/common.sh@51 -- # (( size /= 512 ))
00:04:32.761 23:51:22 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:04:32.761 23:51:22 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2
00:04:34.137 Creating new GPT entries in memory.
00:04:34.137 GPT data structures destroyed! You may now partition the disk using fdisk or
00:04:34.137 other utilities.
00:04:34.137 23:51:23 -- setup/common.sh@57 -- # (( part = 1 ))
00:04:34.137 23:51:23 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:34.137 23:51:23 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:04:34.137 23:51:23 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:04:34.137 23:51:23 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:04:35.074 Creating new GPT entries in memory.
00:04:35.074 The operation has completed successfully.
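The partition_drive trace above reduces to a zap-and-repartition loop: destroy the old label, then carve part_no 1 GiB partitions starting at sector 2048, serializing each sgdisk call with flock. A condensed standalone sketch of that flow, assuming /dev/nvme0n1 is a disposable test disk and /mnt/test is a scratch mount point (the real helper also waits for udev events via sync_dev_uevents.sh):

#!/usr/bin/env bash
set -euo pipefail

disk=/dev/nvme0n1                        # disposable test disk (assumption)
size=$((1073741824 / 512))               # 1 GiB expressed in 512-byte sectors

sgdisk "$disk" --zap-all                 # destroy existing GPT/MBR structures
part_start=2048
for part in 1 2; do
    part_end=$((part_start + size - 1))
    # flock serializes concurrent sgdisk callers against the same disk
    flock "$disk" sgdisk "$disk" --new=${part}:${part_start}:${part_end}
    part_start=$((part_end + 1))
done

mkdir -p /mnt/test
mkfs.ext4 -qF "${disk}p1"                # -q quiet, -F skip the sanity prompt
mount "${disk}p1" /mnt/test

For partition 1 this yields exactly the --new=1:2048:2099199 call seen above; the dm_mount test then stitches nvme0n1p1 and nvme0n1p2 into one device-mapper target (the dmsetup create nvme_dm_test call below) and repeats the same mkfs/mount/verify cycle against /dev/mapper/nvme_dm_test.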
00:04:35.074 23:51:24 -- setup/common.sh@57 -- # (( part++ )) 00:04:35.074 23:51:24 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:35.074 23:51:24 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:35.074 23:51:24 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:35.074 23:51:24 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:36.011 The operation has completed successfully. 00:04:36.011 23:51:25 -- setup/common.sh@57 -- # (( part++ )) 00:04:36.011 23:51:25 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:36.011 23:51:25 -- setup/common.sh@62 -- # wait 436719 00:04:36.011 23:51:25 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:36.011 23:51:25 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.011 23:51:25 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:36.011 23:51:25 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:36.011 23:51:25 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:36.011 23:51:25 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:36.011 23:51:25 -- setup/devices.sh@161 -- # break 00:04:36.011 23:51:25 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:36.011 23:51:25 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:36.011 23:51:25 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:36.011 23:51:25 -- setup/devices.sh@166 -- # dm=dm-0 00:04:36.011 23:51:25 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:36.011 23:51:25 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:36.011 23:51:25 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.011 23:51:25 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:36.011 23:51:25 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.011 23:51:25 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:36.011 23:51:25 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:36.011 23:51:25 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.011 23:51:25 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:36.011 23:51:25 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:36.011 23:51:25 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:36.011 23:51:25 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.011 23:51:25 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:36.011 23:51:25 -- setup/devices.sh@53 -- # local found=0 00:04:36.011 23:51:25 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:36.011 23:51:25 -- setup/devices.sh@56 -- # : 00:04:36.011 
23:51:25 -- setup/devices.sh@59 -- # local pci status 00:04:36.011 23:51:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.011 23:51:25 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:36.011 23:51:25 -- setup/devices.sh@47 -- # setup output config 00:04:36.011 23:51:25 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:36.011 23:51:25 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:39.302 23:51:28 -- setup/devices.sh@63 -- # found=1 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.302 23:51:28 -- setup/devices.sh@62 -- # [[ 
0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:39.302 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.562 23:51:28 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:39.562 23:51:28 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:39.562 23:51:28 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:39.562 23:51:28 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:39.562 23:51:28 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:39.562 23:51:28 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:39.562 23:51:28 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:39.562 23:51:28 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:39.562 23:51:28 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:39.562 23:51:28 -- setup/devices.sh@50 -- # local mount_point= 00:04:39.562 23:51:28 -- setup/devices.sh@51 -- # local test_file= 00:04:39.562 23:51:28 -- setup/devices.sh@53 -- # local found=0 00:04:39.562 23:51:28 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:39.562 23:51:28 -- setup/devices.sh@59 -- # local pci status 00:04:39.562 23:51:28 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.562 23:51:28 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:39.562 23:51:28 -- setup/devices.sh@47 -- # setup output config 00:04:39.562 23:51:28 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:39.562 23:51:28 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:42.851 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.851 23:51:31 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:42.851 23:51:31 -- setup/devices.sh@63 -- # found=1 00:04:42.851 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.851 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- 
# read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:42.852 23:51:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.852 23:51:32 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:42.852 23:51:32 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:42.852 23:51:32 -- setup/devices.sh@68 -- # return 0 00:04:42.852 23:51:32 -- setup/devices.sh@187 -- # cleanup_dm 00:04:42.852 23:51:32 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:42.852 23:51:32 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:42.852 23:51:32 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:42.852 23:51:32 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:42.852 23:51:32 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:42.852 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:42.852 23:51:32 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:42.852 23:51:32 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:42.852 00:04:42.852 real 0m9.858s 00:04:42.852 user 0m2.415s 00:04:42.852 sys 0m4.521s 00:04:42.852 23:51:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:42.852 23:51:32 -- common/autotest_common.sh@10 -- # set +x 00:04:42.852 ************************************ 00:04:42.852 END TEST dm_mount 00:04:42.852 ************************************ 00:04:42.852 23:51:32 -- setup/devices.sh@1 -- # cleanup 00:04:42.852 23:51:32 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:42.852 23:51:32 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.852 23:51:32 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:42.852 23:51:32 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:42.852 23:51:32 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:42.852 23:51:32 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:43.111 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:43.111 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54
00:04:43.111 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:04:43.111 /dev/nvme0n1: calling ioctl to re-read partition table: Success
00:04:43.111 23:51:32 -- setup/devices.sh@12 -- # cleanup_dm
00:04:43.111 23:51:32 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount
00:04:43.111 23:51:32 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]]
00:04:43.111 23:51:32 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:43.111 23:51:32 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]]
00:04:43.111 23:51:32 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]]
00:04:43.111 23:51:32 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1
00:04:43.111
00:04:43.111 real 0m26.712s
00:04:43.111 user 0m7.662s
00:04:43.111 sys 0m13.945s
00:04:43.111 23:51:32 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:43.111 23:51:32 -- common/autotest_common.sh@10 -- # set +x
00:04:43.111 ************************************
00:04:43.111 END TEST devices
00:04:43.111 ************************************
00:04:43.111
00:04:43.111 real 1m32.903s
00:04:43.111 user 0m29.005s
00:04:43.111 sys 0m52.876s
00:04:43.111 23:51:32 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:43.111 23:51:32 -- common/autotest_common.sh@10 -- # set +x
00:04:43.111 ************************************
00:04:43.111 END TEST setup.sh
00:04:43.111 ************************************
00:04:43.111 23:51:32 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:04:46.397 Hugepages
00:04:46.397 node    hugesize      free /  total
00:04:46.397 node0   1048576kB        0 /      0
00:04:46.397 node0      2048kB     2048 /   2048
00:04:46.397 node1   1048576kB        0 /      0
00:04:46.397 node1      2048kB        0 /      0
00:04:46.397
00:04:46.397 Type    BDF             Vendor  Device  NUMA  Driver   Device  Block devices
00:04:46.397 I/OAT   0000:00:04.0    8086    2021    0     ioatdma  -       -
00:04:46.397 I/OAT   0000:00:04.1    8086    2021    0     ioatdma  -       -
00:04:46.397 I/OAT   0000:00:04.2    8086    2021    0     ioatdma  -       -
00:04:46.397 I/OAT   0000:00:04.3    8086    2021    0     ioatdma  -       -
00:04:46.397 I/OAT   0000:00:04.4    8086    2021    0     ioatdma  -       -
00:04:46.397 I/OAT   0000:00:04.5    8086    2021    0     ioatdma  -       -
00:04:46.397 I/OAT   0000:00:04.6    8086    2021    0     ioatdma  -       -
00:04:46.397 I/OAT   0000:00:04.7    8086    2021    0     ioatdma  -       -
00:04:46.397 I/OAT   0000:80:04.0    8086    2021    1     ioatdma  -       -
00:04:46.397 I/OAT   0000:80:04.1    8086    2021    1     ioatdma  -       -
00:04:46.397 I/OAT   0000:80:04.2    8086    2021    1     ioatdma  -       -
00:04:46.397 I/OAT   0000:80:04.3    8086    2021    1     ioatdma  -       -
00:04:46.397 I/OAT   0000:80:04.4    8086    2021    1     ioatdma  -       -
00:04:46.397 I/OAT   0000:80:04.5    8086    2021    1     ioatdma  -       -
00:04:46.397 I/OAT   0000:80:04.6    8086    2021    1     ioatdma  -       -
00:04:46.397 I/OAT   0000:80:04.7    8086    2021    1     ioatdma  -       -
00:04:46.397 NVMe    0000:d8:00.0    8086    0a54    1     nvme     nvme0   nvme0n1
00:04:46.397 23:51:35 -- spdk/autotest.sh@141 -- # uname -s
00:04:46.397 23:51:35 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]]
00:04:46.397 23:51:35 -- spdk/autotest.sh@143 -- # nvme_namespace_revert
00:04:46.397 23:51:35 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:49.687 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:49.687 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:49.687 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:49.687 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:49.687 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
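Each "ioatdma -> vfio-pci" line that setup.sh prints here is a sysfs driver rebind. Stripped of setup.sh's safety checks, the per-function mechanism is roughly the following, using the first I/OAT channel from the table above as the example (standard Linux PCI sysfs knobs; the script's real implementation adds allowlists and error handling):

#!/usr/bin/env bash
bdf=0000:00:04.7
dev=/sys/bus/pci/devices/$bdf

echo "$bdf"   > "$dev/driver/unbind"          # detach ioatdma
echo vfio-pci > "$dev/driver_override"        # pin the next probe to vfio-pci
echo "$bdf"   > /sys/bus/pci/drivers_probe    # re-probe: vfio-pci claims it
echo ""       > "$dev/driver_override"        # clear the override afterwards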
00:04:49.687 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:49.687 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:49.687 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:49.687 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:49.687 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:49.687 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:49.687 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:49.687 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:49.687 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:49.687 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:49.687 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:51.066 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:51.326 23:51:40 -- common/autotest_common.sh@1517 -- # sleep 1 00:04:52.262 23:51:41 -- common/autotest_common.sh@1518 -- # bdfs=() 00:04:52.262 23:51:41 -- common/autotest_common.sh@1518 -- # local bdfs 00:04:52.262 23:51:41 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:04:52.262 23:51:41 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:04:52.262 23:51:41 -- common/autotest_common.sh@1498 -- # bdfs=() 00:04:52.262 23:51:41 -- common/autotest_common.sh@1498 -- # local bdfs 00:04:52.262 23:51:41 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:52.262 23:51:41 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:52.262 23:51:41 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:04:52.262 23:51:41 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:04:52.262 23:51:41 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:04:52.262 23:51:41 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:55.567 Waiting for block devices as requested 00:04:55.567 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:55.567 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:55.567 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:55.826 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:55.826 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:55.826 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:56.084 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:56.084 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:56.084 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:56.084 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:56.345 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:56.345 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:56.345 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:56.603 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:56.603 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:56.603 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:56.862 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:04:56.862 23:51:46 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:04:56.862 23:51:46 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:04:56.862 23:51:46 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:04:56.862 23:51:46 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:04:56.862 23:51:46 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:56.862 23:51:46 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:04:56.862 23:51:46 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:56.862 23:51:46 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:04:56.862 23:51:46 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:04:56.862 23:51:46 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:04:56.862 23:51:46 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:04:56.862 23:51:46 -- common/autotest_common.sh@1530 -- # grep oacs 00:04:56.862 23:51:46 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:04:56.862 23:51:46 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:04:56.862 23:51:46 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:04:56.862 23:51:46 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:04:56.862 23:51:46 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:04:56.862 23:51:46 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:04:56.862 23:51:46 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:04:56.862 23:51:46 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:04:56.862 23:51:46 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:04:56.862 23:51:46 -- common/autotest_common.sh@1542 -- # continue 00:04:56.862 23:51:46 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:04:56.862 23:51:46 -- common/autotest_common.sh@718 -- # xtrace_disable 00:04:56.862 23:51:46 -- common/autotest_common.sh@10 -- # set +x 00:04:57.121 23:51:46 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:04:57.121 23:51:46 -- common/autotest_common.sh@712 -- # xtrace_disable 00:04:57.121 23:51:46 -- common/autotest_common.sh@10 -- # set +x 00:04:57.121 23:51:46 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:00.407 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:00.407 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:00.407 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:00.407 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:00.407 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:00.407 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:00.407 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:00.407 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:00.407 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:00.407 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:00.407 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:00.666 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:00.666 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:00.666 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:00.666 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:00.666 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:02.044 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:02.304 23:51:51 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:05:02.304 23:51:51 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:02.304 23:51:51 -- common/autotest_common.sh@10 -- # set +x 00:05:02.304 23:51:51 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:05:02.304 23:51:51 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:05:02.304 23:51:51 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:05:02.304 23:51:51 -- common/autotest_common.sh@1562 -- # bdfs=() 00:05:02.304 23:51:51 -- common/autotest_common.sh@1562 -- # local bdfs 00:05:02.304 23:51:51 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs
00:05:02.304 23:51:51 -- common/autotest_common.sh@1498 -- # bdfs=()
00:05:02.304 23:51:51 -- common/autotest_common.sh@1498 -- # local bdfs
00:05:02.304 23:51:51 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
00:05:02.304 23:51:51 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh
00:05:02.304 23:51:51 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr'
00:05:02.304 23:51:51 -- common/autotest_common.sh@1500 -- # (( 1 == 0 ))
00:05:02.304 23:51:51 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0
00:05:02.564 23:51:51 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs)
00:05:02.564 23:51:51 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device
00:05:02.564 23:51:51 -- common/autotest_common.sh@1565 -- # device=0x0a54
00:05:02.564 23:51:51 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]]
00:05:02.564 23:51:51 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf)
00:05:02.564 23:51:51 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0
00:05:02.564 23:51:51 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]]
00:05:02.564 23:51:51 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=446654
00:05:02.564 23:51:51 -- common/autotest_common.sh@1583 -- # waitforlisten 446654
00:05:02.564 23:51:51 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:05:02.564 23:51:51 -- common/autotest_common.sh@819 -- # '[' -z 446654 ']'
00:05:02.564 23:51:51 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock
00:05:02.564 23:51:51 -- common/autotest_common.sh@824 -- # local max_retries=100
00:05:02.564 23:51:51 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:05:02.564 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:05:02.564 23:51:51 -- common/autotest_common.sh@828 -- # xtrace_disable
00:05:02.564 23:51:51 -- common/autotest_common.sh@10 -- # set +x
00:05:02.565 [2024-04-25 23:51:51.950196] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
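waitforlisten above is a poll loop: fork spdk_tgt, then retry a harmless RPC against /var/tmp/spdk.sock (up to max_retries=100) until the target answers. A minimal sketch of the same launch-and-wait pattern with this workspace's paths; rpc_get_methods stands in for whatever first RPC the helper actually issues:

#!/usr/bin/env bash
set -euo pipefail
SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

"$SPDK/build/bin/spdk_tgt" &
tgt_pid=$!

for _ in $(seq 1 100); do                # waitforlisten, reduced to its core
    "$SPDK/scripts/rpc.py" -s /var/tmp/spdk.sock rpc_get_methods &>/dev/null && break
    sleep 0.1
done

# Target is up: drive it over JSON-RPC, e.g. attach the controller under test.
"$SPDK/scripts/rpc.py" bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0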
00:05:02.565 [2024-04-25 23:51:51.950279] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid446654 ]
00:05:02.565 EAL: No free 2048 kB hugepages reported on node 1
00:05:02.565 [2024-04-25 23:51:52.021656] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:02.565 [2024-04-25 23:51:52.058744] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:05:02.565 [2024-04-25 23:51:52.058860] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:03.501 23:51:52 -- common/autotest_common.sh@848 -- # (( i == 0 ))
00:05:03.501 23:51:52 -- common/autotest_common.sh@852 -- # return 0
00:05:03.501 23:51:52 -- common/autotest_common.sh@1585 -- # bdf_id=0
00:05:03.501 23:51:52 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}"
00:05:03.501 23:51:52 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
00:05:06.791 nvme0n1
00:05:06.791 23:51:55 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test
00:05:06.791 [2024-04-25 23:51:55.896306] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal
00:05:06.791 request:
00:05:06.791 {
00:05:06.791   "nvme_ctrlr_name": "nvme0",
00:05:06.791   "password": "test",
00:05:06.791   "method": "bdev_nvme_opal_revert",
00:05:06.791   "req_id": 1
00:05:06.791 }
00:05:06.791 Got JSON-RPC error response
00:05:06.791 response:
00:05:06.791 {
00:05:06.791   "code": -32602,
00:05:06.791   "message": "Invalid parameters"
00:05:06.791 }
00:05:06.791 23:51:55 -- common/autotest_common.sh@1589 -- # true
00:05:06.791 23:51:55 -- common/autotest_common.sh@1590 -- # (( ++bdf_id ))
00:05:06.791 23:51:55 -- common/autotest_common.sh@1593 -- # killprocess 446654
00:05:06.791 23:51:55 -- common/autotest_common.sh@926 -- # '[' -z 446654 ']'
00:05:06.791 23:51:55 -- common/autotest_common.sh@930 -- # kill -0 446654
00:05:06.791 23:51:55 -- common/autotest_common.sh@931 -- # uname
00:05:06.791 23:51:55 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']'
00:05:06.791 23:51:55 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 446654
00:05:06.791 23:51:55 -- common/autotest_common.sh@932 -- # process_name=reactor_0
00:05:06.791 23:51:55 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']'
00:05:06.791 23:51:55 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 446654'
00:05:06.791 killing process with pid 446654
00:05:06.791 23:51:55 -- common/autotest_common.sh@945 -- # kill 446654
00:05:06.791 23:51:55 -- common/autotest_common.sh@950 -- # wait 446654
00:05:08.699 23:51:58 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']'
00:05:08.699 23:51:58 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']'
00:05:08.699 23:51:58 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]]
00:05:08.699 23:51:58 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]]
00:05:08.699 23:51:58 -- spdk/autotest.sh@173 -- # timing_enter lib
00:05:08.699 23:51:58 -- common/autotest_common.sh@712 -- # xtrace_disable
00:05:08.699 23:51:58 -- common/autotest_common.sh@10 -- # set +x
00:05:08.699 23:51:58 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh
00:05:08.699 23:51:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:08.699 23:51:58 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:08.699 23:51:58 -- common/autotest_common.sh@10 -- # set +x
00:05:08.699 ************************************
00:05:08.699 START TEST env
00:05:08.699 ************************************
00:05:08.699 23:51:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh
00:05:08.699 * Looking for test storage...
00:05:08.699 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env
00:05:08.699 23:51:58 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut
00:05:08.699 23:51:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:08.699 23:51:58 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:08.699 23:51:58 -- common/autotest_common.sh@10 -- # set +x
00:05:08.699 ************************************
00:05:08.699 START TEST env_memory
00:05:08.699 ************************************
00:05:08.699 23:51:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut
00:05:08.699
00:05:08.699
00:05:08.699 CUnit - A unit testing framework for C - Version 2.1-3
00:05:08.699 http://cunit.sourceforge.net/
00:05:08.699
00:05:08.699
00:05:08.699 Suite: memory
00:05:08.699 Test: alloc and free memory map ...[2024-04-25 23:51:58.264184] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed
00:05:08.699 passed
00:05:08.699 Test: mem map translation ...[2024-04-25 23:51:58.277818] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234
00:05:08.699 [2024-04-25 23:51:58.277834] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152
00:05:08.699 [2024-04-25 23:51:58.277866] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656
00:05:08.699 [2024-04-25 23:51:58.277875] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map
00:05:08.699 passed
00:05:08.699 Test: mem map registration ...[2024-04-25 23:51:58.300071] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234
00:05:08.699 [2024-04-25 23:51:58.300087] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152
00:05:08.699 passed
00:05:08.959 Test: mem map adjacent registrations ...passed
00:05:08.959
00:05:08.959 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:05:08.959               suites      1      1    n/a      0        0
00:05:08.959                tests      4      4      4      0        0
00:05:08.959              asserts    152    152    152      0      n/a
00:05:08.959
00:05:08.959 Elapsed time = 0.089 seconds
00:05:08.959
00:05:08.959 real 0m0.102s
00:05:08.959 user 0m0.089s
00:05:08.959 sys 0m0.013s
00:05:08.959 23:51:58 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:08.959 23:51:58 -- common/autotest_common.sh@10
-- # set +x 00:05:08.959 ************************************ 00:05:08.959 END TEST env_memory 00:05:08.959 ************************************ 00:05:08.959 23:51:58 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:08.959 23:51:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:08.959 23:51:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:08.959 23:51:58 -- common/autotest_common.sh@10 -- # set +x 00:05:08.959 ************************************ 00:05:08.959 START TEST env_vtophys 00:05:08.959 ************************************ 00:05:08.959 23:51:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:08.959 EAL: lib.eal log level changed from notice to debug 00:05:08.959 EAL: Detected lcore 0 as core 0 on socket 0 00:05:08.959 EAL: Detected lcore 1 as core 1 on socket 0 00:05:08.959 EAL: Detected lcore 2 as core 2 on socket 0 00:05:08.959 EAL: Detected lcore 3 as core 3 on socket 0 00:05:08.959 EAL: Detected lcore 4 as core 4 on socket 0 00:05:08.959 EAL: Detected lcore 5 as core 5 on socket 0 00:05:08.959 EAL: Detected lcore 6 as core 6 on socket 0 00:05:08.959 EAL: Detected lcore 7 as core 8 on socket 0 00:05:08.959 EAL: Detected lcore 8 as core 9 on socket 0 00:05:08.959 EAL: Detected lcore 9 as core 10 on socket 0 00:05:08.959 EAL: Detected lcore 10 as core 11 on socket 0 00:05:08.959 EAL: Detected lcore 11 as core 12 on socket 0 00:05:08.959 EAL: Detected lcore 12 as core 13 on socket 0 00:05:08.959 EAL: Detected lcore 13 as core 14 on socket 0 00:05:08.959 EAL: Detected lcore 14 as core 16 on socket 0 00:05:08.959 EAL: Detected lcore 15 as core 17 on socket 0 00:05:08.959 EAL: Detected lcore 16 as core 18 on socket 0 00:05:08.959 EAL: Detected lcore 17 as core 19 on socket 0 00:05:08.959 EAL: Detected lcore 18 as core 20 on socket 0 00:05:08.959 EAL: Detected lcore 19 as core 21 on socket 0 00:05:08.959 EAL: Detected lcore 20 as core 22 on socket 0 00:05:08.959 EAL: Detected lcore 21 as core 24 on socket 0 00:05:08.959 EAL: Detected lcore 22 as core 25 on socket 0 00:05:08.959 EAL: Detected lcore 23 as core 26 on socket 0 00:05:08.959 EAL: Detected lcore 24 as core 27 on socket 0 00:05:08.959 EAL: Detected lcore 25 as core 28 on socket 0 00:05:08.959 EAL: Detected lcore 26 as core 29 on socket 0 00:05:08.959 EAL: Detected lcore 27 as core 30 on socket 0 00:05:08.959 EAL: Detected lcore 28 as core 0 on socket 1 00:05:08.959 EAL: Detected lcore 29 as core 1 on socket 1 00:05:08.959 EAL: Detected lcore 30 as core 2 on socket 1 00:05:08.959 EAL: Detected lcore 31 as core 3 on socket 1 00:05:08.959 EAL: Detected lcore 32 as core 4 on socket 1 00:05:08.959 EAL: Detected lcore 33 as core 5 on socket 1 00:05:08.959 EAL: Detected lcore 34 as core 6 on socket 1 00:05:08.959 EAL: Detected lcore 35 as core 8 on socket 1 00:05:08.959 EAL: Detected lcore 36 as core 9 on socket 1 00:05:08.959 EAL: Detected lcore 37 as core 10 on socket 1 00:05:08.959 EAL: Detected lcore 38 as core 11 on socket 1 00:05:08.959 EAL: Detected lcore 39 as core 12 on socket 1 00:05:08.959 EAL: Detected lcore 40 as core 13 on socket 1 00:05:08.959 EAL: Detected lcore 41 as core 14 on socket 1 00:05:08.959 EAL: Detected lcore 42 as core 16 on socket 1 00:05:08.959 EAL: Detected lcore 43 as core 17 on socket 1 00:05:08.959 EAL: Detected lcore 44 as core 18 on socket 1 00:05:08.959 EAL: Detected lcore 45 as core 19 on socket 1 00:05:08.959 EAL: 
Detected lcore 46 as core 20 on socket 1 00:05:08.959 EAL: Detected lcore 47 as core 21 on socket 1 00:05:08.959 EAL: Detected lcore 48 as core 22 on socket 1 00:05:08.959 EAL: Detected lcore 49 as core 24 on socket 1 00:05:08.959 EAL: Detected lcore 50 as core 25 on socket 1 00:05:08.959 EAL: Detected lcore 51 as core 26 on socket 1 00:05:08.959 EAL: Detected lcore 52 as core 27 on socket 1 00:05:08.959 EAL: Detected lcore 53 as core 28 on socket 1 00:05:08.959 EAL: Detected lcore 54 as core 29 on socket 1 00:05:08.959 EAL: Detected lcore 55 as core 30 on socket 1 00:05:08.959 EAL: Detected lcore 56 as core 0 on socket 0 00:05:08.959 EAL: Detected lcore 57 as core 1 on socket 0 00:05:08.959 EAL: Detected lcore 58 as core 2 on socket 0 00:05:08.959 EAL: Detected lcore 59 as core 3 on socket 0 00:05:08.959 EAL: Detected lcore 60 as core 4 on socket 0 00:05:08.959 EAL: Detected lcore 61 as core 5 on socket 0 00:05:08.959 EAL: Detected lcore 62 as core 6 on socket 0 00:05:08.959 EAL: Detected lcore 63 as core 8 on socket 0 00:05:08.959 EAL: Detected lcore 64 as core 9 on socket 0 00:05:08.959 EAL: Detected lcore 65 as core 10 on socket 0 00:05:08.959 EAL: Detected lcore 66 as core 11 on socket 0 00:05:08.959 EAL: Detected lcore 67 as core 12 on socket 0 00:05:08.959 EAL: Detected lcore 68 as core 13 on socket 0 00:05:08.960 EAL: Detected lcore 69 as core 14 on socket 0 00:05:08.960 EAL: Detected lcore 70 as core 16 on socket 0 00:05:08.960 EAL: Detected lcore 71 as core 17 on socket 0 00:05:08.960 EAL: Detected lcore 72 as core 18 on socket 0 00:05:08.960 EAL: Detected lcore 73 as core 19 on socket 0 00:05:08.960 EAL: Detected lcore 74 as core 20 on socket 0 00:05:08.960 EAL: Detected lcore 75 as core 21 on socket 0 00:05:08.960 EAL: Detected lcore 76 as core 22 on socket 0 00:05:08.960 EAL: Detected lcore 77 as core 24 on socket 0 00:05:08.960 EAL: Detected lcore 78 as core 25 on socket 0 00:05:08.960 EAL: Detected lcore 79 as core 26 on socket 0 00:05:08.960 EAL: Detected lcore 80 as core 27 on socket 0 00:05:08.960 EAL: Detected lcore 81 as core 28 on socket 0 00:05:08.960 EAL: Detected lcore 82 as core 29 on socket 0 00:05:08.960 EAL: Detected lcore 83 as core 30 on socket 0 00:05:08.960 EAL: Detected lcore 84 as core 0 on socket 1 00:05:08.960 EAL: Detected lcore 85 as core 1 on socket 1 00:05:08.960 EAL: Detected lcore 86 as core 2 on socket 1 00:05:08.960 EAL: Detected lcore 87 as core 3 on socket 1 00:05:08.960 EAL: Detected lcore 88 as core 4 on socket 1 00:05:08.960 EAL: Detected lcore 89 as core 5 on socket 1 00:05:08.960 EAL: Detected lcore 90 as core 6 on socket 1 00:05:08.960 EAL: Detected lcore 91 as core 8 on socket 1 00:05:08.960 EAL: Detected lcore 92 as core 9 on socket 1 00:05:08.960 EAL: Detected lcore 93 as core 10 on socket 1 00:05:08.960 EAL: Detected lcore 94 as core 11 on socket 1 00:05:08.960 EAL: Detected lcore 95 as core 12 on socket 1 00:05:08.960 EAL: Detected lcore 96 as core 13 on socket 1 00:05:08.960 EAL: Detected lcore 97 as core 14 on socket 1 00:05:08.960 EAL: Detected lcore 98 as core 16 on socket 1 00:05:08.960 EAL: Detected lcore 99 as core 17 on socket 1 00:05:08.960 EAL: Detected lcore 100 as core 18 on socket 1 00:05:08.960 EAL: Detected lcore 101 as core 19 on socket 1 00:05:08.960 EAL: Detected lcore 102 as core 20 on socket 1 00:05:08.960 EAL: Detected lcore 103 as core 21 on socket 1 00:05:08.960 EAL: Detected lcore 104 as core 22 on socket 1 00:05:08.960 EAL: Detected lcore 105 as core 24 on socket 1 00:05:08.960 EAL: Detected lcore 106 as core 
25 on socket 1 00:05:08.960 EAL: Detected lcore 107 as core 26 on socket 1 00:05:08.960 EAL: Detected lcore 108 as core 27 on socket 1 00:05:08.960 EAL: Detected lcore 109 as core 28 on socket 1 00:05:08.960 EAL: Detected lcore 110 as core 29 on socket 1 00:05:08.960 EAL: Detected lcore 111 as core 30 on socket 1 00:05:08.960 EAL: Maximum logical cores by configuration: 128 00:05:08.960 EAL: Detected CPU lcores: 112 00:05:08.960 EAL: Detected NUMA nodes: 2 00:05:08.960 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:08.960 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:08.960 EAL: Checking presence of .so 'librte_eal.so' 00:05:08.960 EAL: Detected static linkage of DPDK 00:05:08.960 EAL: No shared files mode enabled, IPC will be disabled 00:05:08.960 EAL: Bus pci wants IOVA as 'DC' 00:05:08.960 EAL: Buses did not request a specific IOVA mode. 00:05:08.960 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:08.960 EAL: Selected IOVA mode 'VA' 00:05:08.960 EAL: No free 2048 kB hugepages reported on node 1 00:05:08.960 EAL: Probing VFIO support... 00:05:08.960 EAL: IOMMU type 1 (Type 1) is supported 00:05:08.960 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:08.960 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:08.960 EAL: VFIO support initialized 00:05:08.960 EAL: Ask a virtual area of 0x2e000 bytes 00:05:08.960 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:08.960 EAL: Setting up physically contiguous memory... 00:05:08.960 EAL: Setting maximum number of open files to 524288 00:05:08.960 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:08.960 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:08.960 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:08.960 EAL: Ask a virtual area of 0x61000 bytes 00:05:08.960 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:08.960 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:08.960 EAL: Ask a virtual area of 0x400000000 bytes 00:05:08.960 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:08.960 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:08.960 EAL: Ask a virtual area of 0x61000 bytes 00:05:08.960 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:08.960 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:08.960 EAL: Ask a virtual area of 0x400000000 bytes 00:05:08.960 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:08.960 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:08.960 EAL: Ask a virtual area of 0x61000 bytes 00:05:08.960 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:08.960 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:08.960 EAL: Ask a virtual area of 0x400000000 bytes 00:05:08.960 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:08.960 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:08.960 EAL: Ask a virtual area of 0x61000 bytes 00:05:08.960 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:08.960 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:08.960 EAL: Ask a virtual area of 0x400000000 bytes 00:05:08.960 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:08.960 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:08.960 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:08.960 EAL: Ask a virtual area of 
0x61000 bytes 00:05:08.960 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:08.960 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:08.960 EAL: Ask a virtual area of 0x400000000 bytes 00:05:08.960 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:08.960 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:08.960 EAL: Ask a virtual area of 0x61000 bytes 00:05:08.960 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:08.960 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:08.960 EAL: Ask a virtual area of 0x400000000 bytes 00:05:08.960 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:08.960 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:08.960 EAL: Ask a virtual area of 0x61000 bytes 00:05:08.960 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:08.960 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:08.960 EAL: Ask a virtual area of 0x400000000 bytes 00:05:08.960 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:08.960 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:08.960 EAL: Ask a virtual area of 0x61000 bytes 00:05:08.960 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:08.960 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:08.960 EAL: Ask a virtual area of 0x400000000 bytes 00:05:08.960 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:08.960 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:08.960 EAL: Hugepages will be freed exactly as allocated. 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.960 EAL: TSC frequency is ~2500000 KHz 00:05:08.960 EAL: Main lcore 0 is ready (tid=7efca3f5ca00;cpuset=[0]) 00:05:08.960 EAL: Trying to obtain current memory policy. 00:05:08.960 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:08.960 EAL: Restoring previous memory policy: 0 00:05:08.960 EAL: request: mp_malloc_sync 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.960 EAL: Heap on socket 0 was expanded by 2MB 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.960 EAL: Mem event callback 'spdk:(nil)' registered 00:05:08.960 00:05:08.960 00:05:08.960 CUnit - A unit testing framework for C - Version 2.1-3 00:05:08.960 http://cunit.sourceforge.net/ 00:05:08.960 00:05:08.960 00:05:08.960 Suite: components_suite 00:05:08.960 Test: vtophys_malloc_test ...passed 00:05:08.960 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:08.960 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:08.960 EAL: Restoring previous memory policy: 4 00:05:08.960 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.960 EAL: request: mp_malloc_sync 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.960 EAL: Heap on socket 0 was expanded by 4MB 00:05:08.960 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.960 EAL: request: mp_malloc_sync 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.960 EAL: Heap on socket 0 was shrunk by 4MB 00:05:08.960 EAL: Trying to obtain current memory policy. 
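The repeating expand/shrink pairs that follow are DPDK invoking the mem event callback registered above as 'spdk:(nil)' each time hugepage memory is mapped or unmapped. A minimal sketch of that mechanism, assuming a standard DPDK build with dynamic memory mode; the callback name "demo" and the 4 MB allocation are illustrative, not the SPDK test's:

```c
/* Sketch only: registering a DPDK memory-event callback like 'spdk:(nil)'.
 * Assumes dynamic memory mode; names and sizes are illustrative. */
#include <stdio.h>
#include <rte_eal.h>
#include <rte_malloc.h>
#include <rte_memory.h>

static void
mem_event_cb(enum rte_mem_event type, const void *addr, size_t len, void *arg)
{
	(void)arg;
	printf("mem event: %s addr=%p len=%zu\n",
	       type == RTE_MEM_EVENT_ALLOC ? "alloc" : "free", addr, len);
}

int
main(int argc, char **argv)
{
	if (rte_eal_init(argc, argv) < 0)
		return 1;

	/* EAL invokes the callback whenever hugepage memory is mapped/unmapped. */
	rte_mem_event_callback_register("demo", mem_event_cb, NULL);

	void *buf = rte_malloc_socket("demo", 4 << 20, 0, 0); /* heap expands */
	rte_free(buf);                                        /* heap may shrink */

	return rte_eal_cleanup();
}
```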
00:05:08.960 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:08.960 EAL: Restoring previous memory policy: 4 00:05:08.960 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.960 EAL: request: mp_malloc_sync 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.960 EAL: Heap on socket 0 was expanded by 6MB 00:05:08.960 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.960 EAL: request: mp_malloc_sync 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.960 EAL: Heap on socket 0 was shrunk by 6MB 00:05:08.960 EAL: Trying to obtain current memory policy. 00:05:08.960 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:08.960 EAL: Restoring previous memory policy: 4 00:05:08.960 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.960 EAL: request: mp_malloc_sync 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.960 EAL: Heap on socket 0 was expanded by 10MB 00:05:08.960 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.960 EAL: request: mp_malloc_sync 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.960 EAL: Heap on socket 0 was shrunk by 10MB 00:05:08.960 EAL: Trying to obtain current memory policy. 00:05:08.960 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:08.960 EAL: Restoring previous memory policy: 4 00:05:08.960 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.960 EAL: request: mp_malloc_sync 00:05:08.960 EAL: No shared files mode enabled, IPC is disabled 00:05:08.961 EAL: Heap on socket 0 was expanded by 18MB 00:05:08.961 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.961 EAL: request: mp_malloc_sync 00:05:08.961 EAL: No shared files mode enabled, IPC is disabled 00:05:08.961 EAL: Heap on socket 0 was shrunk by 18MB 00:05:08.961 EAL: Trying to obtain current memory policy. 00:05:08.961 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:08.961 EAL: Restoring previous memory policy: 4 00:05:08.961 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.961 EAL: request: mp_malloc_sync 00:05:08.961 EAL: No shared files mode enabled, IPC is disabled 00:05:08.961 EAL: Heap on socket 0 was expanded by 34MB 00:05:08.961 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.961 EAL: request: mp_malloc_sync 00:05:08.961 EAL: No shared files mode enabled, IPC is disabled 00:05:08.961 EAL: Heap on socket 0 was shrunk by 34MB 00:05:08.961 EAL: Trying to obtain current memory policy. 00:05:08.961 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:08.961 EAL: Restoring previous memory policy: 4 00:05:08.961 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.961 EAL: request: mp_malloc_sync 00:05:08.961 EAL: No shared files mode enabled, IPC is disabled 00:05:08.961 EAL: Heap on socket 0 was expanded by 66MB 00:05:08.961 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.961 EAL: request: mp_malloc_sync 00:05:08.961 EAL: No shared files mode enabled, IPC is disabled 00:05:08.961 EAL: Heap on socket 0 was shrunk by 66MB 00:05:08.961 EAL: Trying to obtain current memory policy. 
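The pattern above (expanded by N MB, then shrunk by N MB, at ever larger N) is the malloc test walking progressively larger DMA-safe allocations and translating them. A hedged sketch of that kind of loop against the public SPDK env API; the app name, alignment, and size ladder are illustrative:

```c
/* Sketch of a vtophys-style loop over progressively larger DMA-safe buffers. */
#include <inttypes.h>
#include <stdio.h>
#include <spdk/env.h>

int
main(void)
{
	struct spdk_env_opts opts;

	spdk_env_opts_init(&opts);
	opts.name = "vtophys_demo";
	if (spdk_env_init(&opts) < 0)
		return 1;

	for (size_t size = 4ULL << 20; size <= 1024ULL << 20; size *= 2) {
		void *buf = spdk_dma_malloc(size, 0x200, NULL); /* heap expands */
		if (buf == NULL)
			break;
		uint64_t phys = spdk_vtophys(buf, NULL);        /* VA -> IOVA */
		printf("%zu MB at %p -> 0x%" PRIx64 "\n", size >> 20, buf, phys);
		spdk_dma_free(buf);                             /* heap shrinks */
	}
	return 0;
}
```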
00:05:08.961 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:08.961 EAL: Restoring previous memory policy: 4 00:05:08.961 EAL: Calling mem event callback 'spdk:(nil)' 00:05:08.961 EAL: request: mp_malloc_sync 00:05:08.961 EAL: No shared files mode enabled, IPC is disabled 00:05:08.961 EAL: Heap on socket 0 was expanded by 130MB 00:05:08.961 EAL: Calling mem event callback 'spdk:(nil)' 00:05:09.221 EAL: request: mp_malloc_sync 00:05:09.221 EAL: No shared files mode enabled, IPC is disabled 00:05:09.221 EAL: Heap on socket 0 was shrunk by 130MB 00:05:09.221 EAL: Trying to obtain current memory policy. 00:05:09.221 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:09.221 EAL: Restoring previous memory policy: 4 00:05:09.221 EAL: Calling mem event callback 'spdk:(nil)' 00:05:09.221 EAL: request: mp_malloc_sync 00:05:09.221 EAL: No shared files mode enabled, IPC is disabled 00:05:09.221 EAL: Heap on socket 0 was expanded by 258MB 00:05:09.221 EAL: Calling mem event callback 'spdk:(nil)' 00:05:09.221 EAL: request: mp_malloc_sync 00:05:09.221 EAL: No shared files mode enabled, IPC is disabled 00:05:09.221 EAL: Heap on socket 0 was shrunk by 258MB 00:05:09.221 EAL: Trying to obtain current memory policy. 00:05:09.221 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:09.221 EAL: Restoring previous memory policy: 4 00:05:09.221 EAL: Calling mem event callback 'spdk:(nil)' 00:05:09.221 EAL: request: mp_malloc_sync 00:05:09.221 EAL: No shared files mode enabled, IPC is disabled 00:05:09.221 EAL: Heap on socket 0 was expanded by 514MB 00:05:09.480 EAL: Calling mem event callback 'spdk:(nil)' 00:05:09.480 EAL: request: mp_malloc_sync 00:05:09.480 EAL: No shared files mode enabled, IPC is disabled 00:05:09.480 EAL: Heap on socket 0 was shrunk by 514MB 00:05:09.480 EAL: Trying to obtain current memory policy. 
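The 'Setting policy MPOL_PREFERRED ... Restoring previous memory policy: 4' pairs bracket each allocation so pages land on the intended socket (4 plausibly corresponds to MPOL_LOCAL). A sketch of the same save/prefer/restore dance with the plain Linux mempolicy API; link with -lnuma, and note a full restore of a mask-based policy would also need the saved nodemask:

```c
/* Sketch of the save/prefer/restore mempolicy pattern behind the log lines. */
#include <stdio.h>
#include <numaif.h>

int
main(void)
{
	int old_mode = 0;
	unsigned long nodemask = 1UL << 0; /* NUMA node 0 ("socket 0") */

	/* Save the policy currently in force for this thread. */
	if (get_mempolicy(&old_mode, NULL, 0, NULL, 0) != 0)
		perror("get_mempolicy");

	/* Prefer node 0 for subsequent page allocations. */
	if (set_mempolicy(MPOL_PREFERRED, &nodemask, sizeof(nodemask) * 8) != 0)
		perror("set_mempolicy");

	/* ... allocate and touch memory here; pages land on node 0 ... */

	/* Restore the saved mode (sufficient for maskless policies). */
	if (set_mempolicy(old_mode, NULL, 0) != 0)
		perror("restore");
	return 0;
}
```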
00:05:09.480 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:09.740 EAL: Restoring previous memory policy: 4 00:05:09.740 EAL: Calling mem event callback 'spdk:(nil)' 00:05:09.740 EAL: request: mp_malloc_sync 00:05:09.740 EAL: No shared files mode enabled, IPC is disabled 00:05:09.740 EAL: Heap on socket 0 was expanded by 1026MB 00:05:09.740 EAL: Calling mem event callback 'spdk:(nil)' 00:05:09.999 EAL: request: mp_malloc_sync 00:05:09.999 EAL: No shared files mode enabled, IPC is disabled 00:05:09.999 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:09.999 passed 00:05:09.999 00:05:09.999 Run Summary: Type Total Ran Passed Failed Inactive 00:05:09.999 suites 1 1 n/a 0 0 00:05:09.999 tests 2 2 2 0 0 00:05:09.999 asserts 497 497 497 0 n/a 00:05:09.999 00:05:09.999 Elapsed time = 0.963 seconds 00:05:09.999 EAL: Calling mem event callback 'spdk:(nil)' 00:05:09.999 EAL: request: mp_malloc_sync 00:05:09.999 EAL: No shared files mode enabled, IPC is disabled 00:05:09.999 EAL: Heap on socket 0 was shrunk by 2MB 00:05:09.999 EAL: No shared files mode enabled, IPC is disabled 00:05:09.999 EAL: No shared files mode enabled, IPC is disabled 00:05:09.999 EAL: No shared files mode enabled, IPC is disabled 00:05:09.999 00:05:09.999 real 0m1.085s 00:05:09.999 user 0m0.632s 00:05:09.999 sys 0m0.426s 00:05:09.999 23:51:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:09.999 23:51:59 -- common/autotest_common.sh@10 -- # set +x 00:05:09.999 ************************************ 00:05:09.999 END TEST env_vtophys 00:05:09.999 ************************************ 00:05:09.999 23:51:59 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:09.999 23:51:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:09.999 23:51:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:09.999 23:51:59 -- common/autotest_common.sh@10 -- # set +x 00:05:09.999 ************************************ 00:05:09.999 START TEST env_pci 00:05:09.999 ************************************ 00:05:09.999 23:51:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:09.999 00:05:09.999 00:05:09.999 CUnit - A unit testing framework for C - Version 2.1-3 00:05:09.999 http://cunit.sourceforge.net/ 00:05:09.999 00:05:09.999 00:05:09.999 Suite: pci 00:05:09.999 Test: pci_hook ...[2024-04-25 23:51:59.531327] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 448019 has claimed it 00:05:09.999 EAL: Cannot find device (10000:00:01.0) 00:05:09.999 EAL: Failed to attach device on primary process 00:05:09.999 passed 00:05:09.999 00:05:09.999 Run Summary: Type Total Ran Passed Failed Inactive 00:05:09.999 suites 1 1 n/a 0 0 00:05:09.999 tests 1 1 1 0 0 00:05:09.999 asserts 25 25 25 0 n/a 00:05:09.999 00:05:09.999 Elapsed time = 0.035 seconds 00:05:09.999 00:05:09.999 real 0m0.054s 00:05:10.000 user 0m0.011s 00:05:10.000 sys 0m0.042s 00:05:10.000 23:51:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:10.000 23:51:59 -- common/autotest_common.sh@10 -- # set +x 00:05:10.000 ************************************ 00:05:10.000 END TEST env_pci 00:05:10.000 ************************************ 00:05:10.259 23:51:59 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:10.259 23:51:59 -- env/env.sh@15 -- # uname 00:05:10.259 23:51:59 -- env/env.sh@15 -- # '[' 
Linux = Linux ']' 00:05:10.259 23:51:59 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:10.259 23:51:59 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:10.259 23:51:59 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']' 00:05:10.259 23:51:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:10.259 23:51:59 -- common/autotest_common.sh@10 -- # set +x 00:05:10.259 ************************************ 00:05:10.259 START TEST env_dpdk_post_init 00:05:10.259 ************************************ 00:05:10.259 23:51:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:10.259 EAL: Detected CPU lcores: 112 00:05:10.259 EAL: Detected NUMA nodes: 2 00:05:10.259 EAL: Detected static linkage of DPDK 00:05:10.259 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:10.259 EAL: Selected IOVA mode 'VA' 00:05:10.259 EAL: No free 2048 kB hugepages reported on node 1 00:05:10.259 EAL: VFIO support initialized 00:05:10.259 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:10.259 EAL: Using IOMMU type 1 (Type 1) 00:05:11.193 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:14.475 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:14.475 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:15.042 Starting DPDK initialization... 00:05:15.042 Starting SPDK post initialization... 00:05:15.042 SPDK NVMe probe 00:05:15.042 Attaching to 0000:d8:00.0 00:05:15.042 Attached to 0000:d8:00.0 00:05:15.042 Cleaning up... 
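The probe output above follows the standard SPDK controller enumeration flow: a probe callback decides whether to claim each controller found on the PCI bus, and an attach callback receives it. A minimal sketch assuming the public spdk_nvme_probe() API; the app name is illustrative, and keeping only the last controller is for brevity:

```c
/* Sketch of SPDK NVMe enumeration: probe_cb claims, attach_cb receives. */
#include <stdbool.h>
#include <stdio.h>
#include <spdk/env.h>
#include <spdk/nvme.h>

static struct spdk_nvme_ctrlr *g_ctrlr; /* keeps the last controller only */

static bool
probe_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
	 struct spdk_nvme_ctrlr_opts *opts)
{
	(void)ctx; (void)opts;
	printf("Attaching to %s\n", trid->traddr);
	return true; /* claim every controller the scan finds */
}

static void
attach_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
	  struct spdk_nvme_ctrlr *ctrlr, const struct spdk_nvme_ctrlr_opts *opts)
{
	(void)ctx; (void)opts;
	printf("Attached to %s\n", trid->traddr);
	g_ctrlr = ctrlr;
}

int
main(void)
{
	struct spdk_env_opts opts;

	spdk_env_opts_init(&opts);
	opts.name = "probe_demo";
	if (spdk_env_init(&opts) < 0)
		return 1;

	/* NULL trid == enumerate local PCIe NVMe controllers. */
	if (spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL) != 0)
		return 1;

	printf("Cleaning up...\n");
	if (g_ctrlr != NULL)
		spdk_nvme_detach(g_ctrlr);
	return 0;
}
```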
00:05:15.042 00:05:15.042 real 0m4.747s 00:05:15.042 user 0m3.545s 00:05:15.042 sys 0m0.449s 00:05:15.042 23:52:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:15.042 23:52:04 -- common/autotest_common.sh@10 -- # set +x 00:05:15.042 ************************************ 00:05:15.042 END TEST env_dpdk_post_init 00:05:15.042 ************************************ 00:05:15.042 23:52:04 -- env/env.sh@26 -- # uname 00:05:15.042 23:52:04 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:15.042 23:52:04 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:15.042 23:52:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:15.042 23:52:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:15.042 23:52:04 -- common/autotest_common.sh@10 -- # set +x 00:05:15.042 ************************************ 00:05:15.042 START TEST env_mem_callbacks 00:05:15.042 ************************************ 00:05:15.042 23:52:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:15.042 EAL: Detected CPU lcores: 112 00:05:15.042 EAL: Detected NUMA nodes: 2 00:05:15.042 EAL: Detected static linkage of DPDK 00:05:15.042 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:15.042 EAL: Selected IOVA mode 'VA' 00:05:15.042 EAL: No free 2048 kB hugepages reported on node 1 00:05:15.042 EAL: VFIO support initialized 00:05:15.042 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:15.042 00:05:15.042 00:05:15.042 CUnit - A unit testing framework for C - Version 2.1-3 00:05:15.042 http://cunit.sourceforge.net/ 00:05:15.042 00:05:15.042 00:05:15.042 Suite: memory 00:05:15.042 Test: test ... 
00:05:15.042 register 0x200000200000 2097152 00:05:15.042 malloc 3145728 00:05:15.042 register 0x200000400000 4194304 00:05:15.042 buf 0x200000500000 len 3145728 PASSED 00:05:15.042 malloc 64 00:05:15.042 buf 0x2000004fff40 len 64 PASSED 00:05:15.042 malloc 4194304 00:05:15.042 register 0x200000800000 6291456 00:05:15.042 buf 0x200000a00000 len 4194304 PASSED 00:05:15.042 free 0x200000500000 3145728 00:05:15.042 free 0x2000004fff40 64 00:05:15.042 unregister 0x200000400000 4194304 PASSED 00:05:15.042 free 0x200000a00000 4194304 00:05:15.042 unregister 0x200000800000 6291456 PASSED 00:05:15.042 malloc 8388608 00:05:15.042 register 0x200000400000 10485760 00:05:15.042 buf 0x200000600000 len 8388608 PASSED 00:05:15.042 free 0x200000600000 8388608 00:05:15.042 unregister 0x200000400000 10485760 PASSED 00:05:15.042 passed 00:05:15.042 00:05:15.042 Run Summary: Type Total Ran Passed Failed Inactive 00:05:15.042 suites 1 1 n/a 0 0 00:05:15.042 tests 1 1 1 0 0 00:05:15.042 asserts 15 15 15 0 n/a 00:05:15.042 00:05:15.042 Elapsed time = 0.005 seconds 00:05:15.042 00:05:15.042 real 0m0.064s 00:05:15.043 user 0m0.020s 00:05:15.043 sys 0m0.044s 00:05:15.043 23:52:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:15.043 23:52:04 -- common/autotest_common.sh@10 -- # set +x 00:05:15.043 ************************************ 00:05:15.043 END TEST env_mem_callbacks 00:05:15.043 ************************************ 00:05:15.043 00:05:15.043 real 0m6.411s 00:05:15.043 user 0m4.417s 00:05:15.043 sys 0m1.264s 00:05:15.043 23:52:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:15.043 23:52:04 -- common/autotest_common.sh@10 -- # set +x 00:05:15.043 ************************************ 00:05:15.043 END TEST env 00:05:15.043 ************************************ 00:05:15.043 23:52:04 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:15.043 23:52:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:15.043 23:52:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:15.043 23:52:04 -- common/autotest_common.sh@10 -- # set +x 00:05:15.043 ************************************ 00:05:15.043 START TEST rpc 00:05:15.043 ************************************ 00:05:15.043 23:52:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:15.301 * Looking for test storage... 00:05:15.301 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:15.301 23:52:04 -- rpc/rpc.sh@65 -- # spdk_pid=449115 00:05:15.301 23:52:04 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:15.301 23:52:04 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:15.301 23:52:04 -- rpc/rpc.sh@67 -- # waitforlisten 449115 00:05:15.301 23:52:04 -- common/autotest_common.sh@819 -- # '[' -z 449115 ']' 00:05:15.301 23:52:04 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:15.301 23:52:04 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:15.301 23:52:04 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:15.301 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
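The register/unregister lines in the memory suite above are notifications fired as address ranges enter and leave the SPDK env's memory maps. A sketch of driving that from the application side with spdk_mem_register()/spdk_mem_unregister(); the align-up reflects the documented 2 MB granularity, and all names and sizes are illustrative:

```c
/* Sketch: registering externally mapped memory with the SPDK env, which
 * notifies every registered memory map (REGISTER/UNREGISTER). */
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <spdk/env.h>

int
main(void)
{
	struct spdk_env_opts opts;
	size_t len = 2 * 1024 * 1024;

	spdk_env_opts_init(&opts);
	opts.name = "mem_cb_demo";
	if (spdk_env_init(&opts) < 0)
		return 1;

	/* Over-map so a 2 MB-aligned window of size len fits inside. */
	uint8_t *raw = mmap(NULL, len * 2, PROT_READ | PROT_WRITE,
			    MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
	if (raw == MAP_FAILED)
		return 1;
	uint8_t *buf = (uint8_t *)(((uintptr_t)raw + len - 1) &
				   ~(uintptr_t)(len - 1));

	if (spdk_mem_register(buf, len) == 0) {   /* notify: REGISTER */
		/* ... buf is now usable for DMA / vtophys translation ... */
		spdk_mem_unregister(buf, len);    /* notify: UNREGISTER */
	}
	munmap(raw, len * 2);
	return 0;
}
```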
00:05:15.301 23:52:04 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:15.301 23:52:04 -- common/autotest_common.sh@10 -- # set +x 00:05:15.301 [2024-04-25 23:52:04.706259] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:15.301 [2024-04-25 23:52:04.706347] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid449115 ] 00:05:15.301 EAL: No free 2048 kB hugepages reported on node 1 00:05:15.301 [2024-04-25 23:52:04.775955] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:15.301 [2024-04-25 23:52:04.813064] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:15.301 [2024-04-25 23:52:04.813187] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:15.301 [2024-04-25 23:52:04.813199] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 449115' to capture a snapshot of events at runtime. 00:05:15.301 [2024-04-25 23:52:04.813209] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid449115 for offline analysis/debug. 00:05:15.301 [2024-04-25 23:52:04.813232] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:16.237 23:52:05 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:16.237 23:52:05 -- common/autotest_common.sh@852 -- # return 0 00:05:16.237 23:52:05 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:16.237 23:52:05 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:16.237 23:52:05 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:16.237 23:52:05 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:16.237 23:52:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:16.237 23:52:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:16.237 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.237 ************************************ 00:05:16.237 START TEST rpc_integrity 00:05:16.237 ************************************ 00:05:16.237 23:52:05 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:16.237 23:52:05 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:16.237 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.237 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.237 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.237 23:52:05 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:16.237 23:52:05 -- rpc/rpc.sh@13 -- # jq length 00:05:16.237 23:52:05 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:16.237 23:52:05 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:16.237 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.237 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.237 23:52:05 -- 
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.237 23:52:05 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:16.237 23:52:05 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:16.237 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.237 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.237 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.237 23:52:05 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:16.237 { 00:05:16.237 "name": "Malloc0", 00:05:16.237 "aliases": [ 00:05:16.237 "cbfd8438-c761-43f0-b042-ba356784ccf7" 00:05:16.237 ], 00:05:16.237 "product_name": "Malloc disk", 00:05:16.237 "block_size": 512, 00:05:16.237 "num_blocks": 16384, 00:05:16.237 "uuid": "cbfd8438-c761-43f0-b042-ba356784ccf7", 00:05:16.237 "assigned_rate_limits": { 00:05:16.237 "rw_ios_per_sec": 0, 00:05:16.237 "rw_mbytes_per_sec": 0, 00:05:16.237 "r_mbytes_per_sec": 0, 00:05:16.237 "w_mbytes_per_sec": 0 00:05:16.237 }, 00:05:16.237 "claimed": false, 00:05:16.237 "zoned": false, 00:05:16.237 "supported_io_types": { 00:05:16.237 "read": true, 00:05:16.237 "write": true, 00:05:16.237 "unmap": true, 00:05:16.237 "write_zeroes": true, 00:05:16.237 "flush": true, 00:05:16.237 "reset": true, 00:05:16.237 "compare": false, 00:05:16.237 "compare_and_write": false, 00:05:16.237 "abort": true, 00:05:16.237 "nvme_admin": false, 00:05:16.237 "nvme_io": false 00:05:16.237 }, 00:05:16.237 "memory_domains": [ 00:05:16.237 { 00:05:16.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:16.237 "dma_device_type": 2 00:05:16.237 } 00:05:16.237 ], 00:05:16.237 "driver_specific": {} 00:05:16.237 } 00:05:16.237 ]' 00:05:16.237 23:52:05 -- rpc/rpc.sh@17 -- # jq length 00:05:16.237 23:52:05 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:16.237 23:52:05 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:16.237 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.237 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.237 [2024-04-25 23:52:05.660651] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:16.237 [2024-04-25 23:52:05.660686] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:16.237 [2024-04-25 23:52:05.660703] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5bef090 00:05:16.237 [2024-04-25 23:52:05.660713] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:16.237 [2024-04-25 23:52:05.661599] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:16.237 [2024-04-25 23:52:05.661620] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:16.237 Passthru0 00:05:16.237 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.237 23:52:05 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:16.237 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.237 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.237 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.237 23:52:05 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:16.237 { 00:05:16.237 "name": "Malloc0", 00:05:16.237 "aliases": [ 00:05:16.237 "cbfd8438-c761-43f0-b042-ba356784ccf7" 00:05:16.237 ], 00:05:16.237 "product_name": "Malloc disk", 00:05:16.237 "block_size": 512, 00:05:16.237 "num_blocks": 16384, 00:05:16.237 "uuid": "cbfd8438-c761-43f0-b042-ba356784ccf7", 00:05:16.237 "assigned_rate_limits": { 00:05:16.237 "rw_ios_per_sec": 0, 00:05:16.237 
"rw_mbytes_per_sec": 0, 00:05:16.237 "r_mbytes_per_sec": 0, 00:05:16.237 "w_mbytes_per_sec": 0 00:05:16.237 }, 00:05:16.237 "claimed": true, 00:05:16.237 "claim_type": "exclusive_write", 00:05:16.237 "zoned": false, 00:05:16.237 "supported_io_types": { 00:05:16.237 "read": true, 00:05:16.237 "write": true, 00:05:16.237 "unmap": true, 00:05:16.237 "write_zeroes": true, 00:05:16.237 "flush": true, 00:05:16.237 "reset": true, 00:05:16.237 "compare": false, 00:05:16.237 "compare_and_write": false, 00:05:16.237 "abort": true, 00:05:16.237 "nvme_admin": false, 00:05:16.237 "nvme_io": false 00:05:16.237 }, 00:05:16.237 "memory_domains": [ 00:05:16.237 { 00:05:16.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:16.237 "dma_device_type": 2 00:05:16.237 } 00:05:16.237 ], 00:05:16.237 "driver_specific": {} 00:05:16.237 }, 00:05:16.237 { 00:05:16.237 "name": "Passthru0", 00:05:16.237 "aliases": [ 00:05:16.237 "4640647a-3656-553b-82dd-88a464477cc4" 00:05:16.237 ], 00:05:16.237 "product_name": "passthru", 00:05:16.237 "block_size": 512, 00:05:16.237 "num_blocks": 16384, 00:05:16.237 "uuid": "4640647a-3656-553b-82dd-88a464477cc4", 00:05:16.237 "assigned_rate_limits": { 00:05:16.237 "rw_ios_per_sec": 0, 00:05:16.237 "rw_mbytes_per_sec": 0, 00:05:16.237 "r_mbytes_per_sec": 0, 00:05:16.237 "w_mbytes_per_sec": 0 00:05:16.237 }, 00:05:16.237 "claimed": false, 00:05:16.237 "zoned": false, 00:05:16.237 "supported_io_types": { 00:05:16.237 "read": true, 00:05:16.237 "write": true, 00:05:16.237 "unmap": true, 00:05:16.237 "write_zeroes": true, 00:05:16.237 "flush": true, 00:05:16.237 "reset": true, 00:05:16.237 "compare": false, 00:05:16.237 "compare_and_write": false, 00:05:16.237 "abort": true, 00:05:16.237 "nvme_admin": false, 00:05:16.237 "nvme_io": false 00:05:16.237 }, 00:05:16.237 "memory_domains": [ 00:05:16.237 { 00:05:16.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:16.237 "dma_device_type": 2 00:05:16.237 } 00:05:16.237 ], 00:05:16.237 "driver_specific": { 00:05:16.237 "passthru": { 00:05:16.237 "name": "Passthru0", 00:05:16.237 "base_bdev_name": "Malloc0" 00:05:16.237 } 00:05:16.237 } 00:05:16.237 } 00:05:16.237 ]' 00:05:16.237 23:52:05 -- rpc/rpc.sh@21 -- # jq length 00:05:16.237 23:52:05 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:16.237 23:52:05 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:16.237 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.237 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.237 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.237 23:52:05 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:16.237 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.237 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.237 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.237 23:52:05 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:16.237 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.237 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.237 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.237 23:52:05 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:16.237 23:52:05 -- rpc/rpc.sh@26 -- # jq length 00:05:16.237 23:52:05 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:16.237 00:05:16.237 real 0m0.280s 00:05:16.237 user 0m0.175s 00:05:16.237 sys 0m0.043s 00:05:16.237 23:52:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:16.237 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.237 
************************************ 00:05:16.237 END TEST rpc_integrity 00:05:16.237 ************************************ 00:05:16.237 23:52:05 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:16.238 23:52:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:16.238 23:52:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:16.238 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.496 ************************************ 00:05:16.496 START TEST rpc_plugins 00:05:16.496 ************************************ 00:05:16.496 23:52:05 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:05:16.496 23:52:05 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:16.496 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.496 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.496 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.496 23:52:05 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:16.496 23:52:05 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:16.496 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.496 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.496 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.496 23:52:05 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:16.496 { 00:05:16.496 "name": "Malloc1", 00:05:16.496 "aliases": [ 00:05:16.496 "e67502ff-d5fc-45cd-acef-6c2f39c36b50" 00:05:16.496 ], 00:05:16.496 "product_name": "Malloc disk", 00:05:16.496 "block_size": 4096, 00:05:16.496 "num_blocks": 256, 00:05:16.496 "uuid": "e67502ff-d5fc-45cd-acef-6c2f39c36b50", 00:05:16.496 "assigned_rate_limits": { 00:05:16.496 "rw_ios_per_sec": 0, 00:05:16.496 "rw_mbytes_per_sec": 0, 00:05:16.496 "r_mbytes_per_sec": 0, 00:05:16.496 "w_mbytes_per_sec": 0 00:05:16.496 }, 00:05:16.496 "claimed": false, 00:05:16.496 "zoned": false, 00:05:16.496 "supported_io_types": { 00:05:16.496 "read": true, 00:05:16.496 "write": true, 00:05:16.496 "unmap": true, 00:05:16.496 "write_zeroes": true, 00:05:16.496 "flush": true, 00:05:16.496 "reset": true, 00:05:16.496 "compare": false, 00:05:16.496 "compare_and_write": false, 00:05:16.496 "abort": true, 00:05:16.496 "nvme_admin": false, 00:05:16.496 "nvme_io": false 00:05:16.496 }, 00:05:16.496 "memory_domains": [ 00:05:16.496 { 00:05:16.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:16.496 "dma_device_type": 2 00:05:16.496 } 00:05:16.496 ], 00:05:16.496 "driver_specific": {} 00:05:16.496 } 00:05:16.496 ]' 00:05:16.496 23:52:05 -- rpc/rpc.sh@32 -- # jq length 00:05:16.496 23:52:05 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:16.496 23:52:05 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:16.496 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.496 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.496 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.496 23:52:05 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:16.496 23:52:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.496 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.496 23:52:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.496 23:52:05 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:16.496 23:52:05 -- rpc/rpc.sh@36 -- # jq length 00:05:16.496 23:52:05 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:16.496 00:05:16.496 real 0m0.135s 00:05:16.496 user 0m0.082s 00:05:16.496 sys 0m0.021s 00:05:16.496 23:52:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 
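rpc_cmd above is just a thin client; on the target side each method name is bound to a handler. The create_malloc/delete_malloc calls in the plugin test come from a Python rpc plugin, but the equivalent C-side registration looks roughly like this sketch. The method name "hello_demo" is invented for illustration, and the fragment is meant to be linked into an SPDK application rather than built standalone:

```c
/* Sketch of target-side RPC registration (what rpc_cmd talks to). */
#include <spdk/rpc.h>
#include <spdk/jsonrpc.h>
#include <spdk/json.h>

static void
rpc_hello_demo(struct spdk_jsonrpc_request *request,
	       const struct spdk_json_val *params)
{
	struct spdk_json_write_ctx *w;

	if (params != NULL) {
		spdk_jsonrpc_send_error_response(request,
						 SPDK_JSONRPC_ERROR_INVALID_PARAMS,
						 "hello_demo takes no parameters");
		return;
	}

	w = spdk_jsonrpc_begin_result(request);
	spdk_json_write_string(w, "hello");
	spdk_jsonrpc_end_result(request, w);
}
/* Callable once the app is up, e.g. rpc.py hello_demo (or rpc_cmd in tests). */
SPDK_RPC_REGISTER("hello_demo", rpc_hello_demo, SPDK_RPC_RUNTIME)
```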
00:05:16.496 23:52:05 -- common/autotest_common.sh@10 -- # set +x 00:05:16.496 ************************************ 00:05:16.496 END TEST rpc_plugins 00:05:16.496 ************************************ 00:05:16.496 23:52:06 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:16.496 23:52:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:16.496 23:52:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:16.496 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:16.496 ************************************ 00:05:16.496 START TEST rpc_trace_cmd_test 00:05:16.497 ************************************ 00:05:16.497 23:52:06 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:05:16.497 23:52:06 -- rpc/rpc.sh@40 -- # local info 00:05:16.497 23:52:06 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:16.497 23:52:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.497 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:16.497 23:52:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.497 23:52:06 -- rpc/rpc.sh@42 -- # info='{ 00:05:16.497 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid449115", 00:05:16.497 "tpoint_group_mask": "0x8", 00:05:16.497 "iscsi_conn": { 00:05:16.497 "mask": "0x2", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "scsi": { 00:05:16.497 "mask": "0x4", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "bdev": { 00:05:16.497 "mask": "0x8", 00:05:16.497 "tpoint_mask": "0xffffffffffffffff" 00:05:16.497 }, 00:05:16.497 "nvmf_rdma": { 00:05:16.497 "mask": "0x10", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "nvmf_tcp": { 00:05:16.497 "mask": "0x20", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "ftl": { 00:05:16.497 "mask": "0x40", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "blobfs": { 00:05:16.497 "mask": "0x80", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "dsa": { 00:05:16.497 "mask": "0x200", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "thread": { 00:05:16.497 "mask": "0x400", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "nvme_pcie": { 00:05:16.497 "mask": "0x800", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "iaa": { 00:05:16.497 "mask": "0x1000", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "nvme_tcp": { 00:05:16.497 "mask": "0x2000", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 }, 00:05:16.497 "bdev_nvme": { 00:05:16.497 "mask": "0x4000", 00:05:16.497 "tpoint_mask": "0x0" 00:05:16.497 } 00:05:16.497 }' 00:05:16.497 23:52:06 -- rpc/rpc.sh@43 -- # jq length 00:05:16.497 23:52:06 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:16.755 23:52:06 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:16.755 23:52:06 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:16.755 23:52:06 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:16.755 23:52:06 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:16.755 23:52:06 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:16.755 23:52:06 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:16.755 23:52:06 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:16.755 23:52:06 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:16.755 00:05:16.755 real 0m0.236s 00:05:16.755 user 0m0.190s 00:05:16.755 sys 0m0.037s 00:05:16.755 23:52:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:16.755 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:16.755 
************************************ 00:05:16.755 END TEST rpc_trace_cmd_test 00:05:16.755 ************************************ 00:05:16.755 23:52:06 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:16.755 23:52:06 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:16.755 23:52:06 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:16.755 23:52:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:16.755 23:52:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:16.755 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:16.755 ************************************ 00:05:16.755 START TEST rpc_daemon_integrity 00:05:16.755 ************************************ 00:05:16.755 23:52:06 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:16.755 23:52:06 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:16.755 23:52:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.755 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:16.755 23:52:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.755 23:52:06 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:16.755 23:52:06 -- rpc/rpc.sh@13 -- # jq length 00:05:17.014 23:52:06 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:17.014 23:52:06 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:17.014 23:52:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.014 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:17.014 23:52:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.014 23:52:06 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:17.014 23:52:06 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:17.014 23:52:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.014 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:17.014 23:52:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.014 23:52:06 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:17.014 { 00:05:17.014 "name": "Malloc2", 00:05:17.014 "aliases": [ 00:05:17.014 "aedbb153-4018-40ce-9f0e-9bc62d52fb61" 00:05:17.014 ], 00:05:17.014 "product_name": "Malloc disk", 00:05:17.014 "block_size": 512, 00:05:17.014 "num_blocks": 16384, 00:05:17.014 "uuid": "aedbb153-4018-40ce-9f0e-9bc62d52fb61", 00:05:17.014 "assigned_rate_limits": { 00:05:17.014 "rw_ios_per_sec": 0, 00:05:17.014 "rw_mbytes_per_sec": 0, 00:05:17.014 "r_mbytes_per_sec": 0, 00:05:17.014 "w_mbytes_per_sec": 0 00:05:17.014 }, 00:05:17.014 "claimed": false, 00:05:17.014 "zoned": false, 00:05:17.014 "supported_io_types": { 00:05:17.014 "read": true, 00:05:17.014 "write": true, 00:05:17.014 "unmap": true, 00:05:17.014 "write_zeroes": true, 00:05:17.014 "flush": true, 00:05:17.014 "reset": true, 00:05:17.014 "compare": false, 00:05:17.014 "compare_and_write": false, 00:05:17.014 "abort": true, 00:05:17.014 "nvme_admin": false, 00:05:17.014 "nvme_io": false 00:05:17.014 }, 00:05:17.014 "memory_domains": [ 00:05:17.014 { 00:05:17.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:17.014 "dma_device_type": 2 00:05:17.014 } 00:05:17.014 ], 00:05:17.014 "driver_specific": {} 00:05:17.014 } 00:05:17.014 ]' 00:05:17.014 23:52:06 -- rpc/rpc.sh@17 -- # jq length 00:05:17.014 23:52:06 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:17.014 23:52:06 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:17.014 23:52:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.014 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:17.014 [2024-04-25 23:52:06.458694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:05:17.014 [2024-04-25 23:52:06.458723] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:17.014 [2024-04-25 23:52:06.458742] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5d78980 00:05:17.014 [2024-04-25 23:52:06.458752] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:17.014 [2024-04-25 23:52:06.459451] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:17.014 [2024-04-25 23:52:06.459471] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:17.014 Passthru0 00:05:17.014 23:52:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.014 23:52:06 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:17.014 23:52:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.014 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:17.014 23:52:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.014 23:52:06 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:17.014 { 00:05:17.014 "name": "Malloc2", 00:05:17.014 "aliases": [ 00:05:17.014 "aedbb153-4018-40ce-9f0e-9bc62d52fb61" 00:05:17.014 ], 00:05:17.014 "product_name": "Malloc disk", 00:05:17.014 "block_size": 512, 00:05:17.014 "num_blocks": 16384, 00:05:17.014 "uuid": "aedbb153-4018-40ce-9f0e-9bc62d52fb61", 00:05:17.014 "assigned_rate_limits": { 00:05:17.014 "rw_ios_per_sec": 0, 00:05:17.014 "rw_mbytes_per_sec": 0, 00:05:17.014 "r_mbytes_per_sec": 0, 00:05:17.014 "w_mbytes_per_sec": 0 00:05:17.014 }, 00:05:17.014 "claimed": true, 00:05:17.014 "claim_type": "exclusive_write", 00:05:17.014 "zoned": false, 00:05:17.014 "supported_io_types": { 00:05:17.014 "read": true, 00:05:17.014 "write": true, 00:05:17.014 "unmap": true, 00:05:17.014 "write_zeroes": true, 00:05:17.014 "flush": true, 00:05:17.014 "reset": true, 00:05:17.014 "compare": false, 00:05:17.014 "compare_and_write": false, 00:05:17.014 "abort": true, 00:05:17.014 "nvme_admin": false, 00:05:17.014 "nvme_io": false 00:05:17.014 }, 00:05:17.014 "memory_domains": [ 00:05:17.014 { 00:05:17.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:17.014 "dma_device_type": 2 00:05:17.014 } 00:05:17.014 ], 00:05:17.014 "driver_specific": {} 00:05:17.014 }, 00:05:17.014 { 00:05:17.014 "name": "Passthru0", 00:05:17.014 "aliases": [ 00:05:17.014 "3838cf91-304c-5656-99d4-f6d0eb214c52" 00:05:17.014 ], 00:05:17.014 "product_name": "passthru", 00:05:17.014 "block_size": 512, 00:05:17.014 "num_blocks": 16384, 00:05:17.014 "uuid": "3838cf91-304c-5656-99d4-f6d0eb214c52", 00:05:17.014 "assigned_rate_limits": { 00:05:17.014 "rw_ios_per_sec": 0, 00:05:17.014 "rw_mbytes_per_sec": 0, 00:05:17.014 "r_mbytes_per_sec": 0, 00:05:17.014 "w_mbytes_per_sec": 0 00:05:17.014 }, 00:05:17.014 "claimed": false, 00:05:17.014 "zoned": false, 00:05:17.014 "supported_io_types": { 00:05:17.014 "read": true, 00:05:17.014 "write": true, 00:05:17.014 "unmap": true, 00:05:17.014 "write_zeroes": true, 00:05:17.014 "flush": true, 00:05:17.014 "reset": true, 00:05:17.014 "compare": false, 00:05:17.014 "compare_and_write": false, 00:05:17.014 "abort": true, 00:05:17.014 "nvme_admin": false, 00:05:17.014 "nvme_io": false 00:05:17.014 }, 00:05:17.014 "memory_domains": [ 00:05:17.014 { 00:05:17.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:17.014 "dma_device_type": 2 00:05:17.014 } 00:05:17.014 ], 00:05:17.014 "driver_specific": { 00:05:17.014 "passthru": { 00:05:17.014 "name": "Passthru0", 00:05:17.014 "base_bdev_name": "Malloc2" 00:05:17.014 } 
00:05:17.014 } 00:05:17.014 } 00:05:17.014 ]' 00:05:17.014 23:52:06 -- rpc/rpc.sh@21 -- # jq length 00:05:17.014 23:52:06 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:17.014 23:52:06 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:17.014 23:52:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.014 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:17.014 23:52:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.014 23:52:06 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:17.014 23:52:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.014 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:17.014 23:52:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.014 23:52:06 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:17.014 23:52:06 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:17.014 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:17.014 23:52:06 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:17.014 23:52:06 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:17.014 23:52:06 -- rpc/rpc.sh@26 -- # jq length 00:05:17.014 23:52:06 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:17.014 00:05:17.014 real 0m0.284s 00:05:17.014 user 0m0.176s 00:05:17.014 sys 0m0.044s 00:05:17.014 23:52:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:17.014 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:17.014 ************************************ 00:05:17.014 END TEST rpc_daemon_integrity 00:05:17.014 ************************************ 00:05:17.273 23:52:06 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:17.273 23:52:06 -- rpc/rpc.sh@84 -- # killprocess 449115 00:05:17.273 23:52:06 -- common/autotest_common.sh@926 -- # '[' -z 449115 ']' 00:05:17.273 23:52:06 -- common/autotest_common.sh@930 -- # kill -0 449115 00:05:17.273 23:52:06 -- common/autotest_common.sh@931 -- # uname 00:05:17.273 23:52:06 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:17.273 23:52:06 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 449115 00:05:17.273 23:52:06 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:17.273 23:52:06 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:17.273 23:52:06 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 449115' 00:05:17.273 killing process with pid 449115 00:05:17.273 23:52:06 -- common/autotest_common.sh@945 -- # kill 449115 00:05:17.273 23:52:06 -- common/autotest_common.sh@950 -- # wait 449115 00:05:17.531 00:05:17.531 real 0m2.409s 00:05:17.531 user 0m3.063s 00:05:17.531 sys 0m0.707s 00:05:17.531 23:52:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:17.531 23:52:06 -- common/autotest_common.sh@10 -- # set +x 00:05:17.531 ************************************ 00:05:17.531 END TEST rpc 00:05:17.531 ************************************ 00:05:17.531 23:52:07 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:17.531 23:52:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:17.531 23:52:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:17.531 23:52:07 -- common/autotest_common.sh@10 -- # set +x 00:05:17.531 ************************************ 00:05:17.531 START TEST rpc_client 00:05:17.531 ************************************ 00:05:17.531 23:52:07 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:17.531 * Looking for test storage... 00:05:17.531 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:17.531 23:52:07 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:17.531 OK 00:05:17.531 23:52:07 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:17.531 00:05:17.531 real 0m0.100s 00:05:17.531 user 0m0.038s 00:05:17.531 sys 0m0.066s 00:05:17.531 23:52:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:17.531 23:52:07 -- common/autotest_common.sh@10 -- # set +x 00:05:17.531 ************************************ 00:05:17.531 END TEST rpc_client 00:05:17.531 ************************************ 00:05:17.790 23:52:07 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:17.790 23:52:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:17.790 23:52:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:17.790 23:52:07 -- common/autotest_common.sh@10 -- # set +x 00:05:17.790 ************************************ 00:05:17.790 START TEST json_config 00:05:17.790 ************************************ 00:05:17.790 23:52:07 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:17.790 23:52:07 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:17.790 23:52:07 -- nvmf/common.sh@7 -- # uname -s 00:05:17.790 23:52:07 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:17.790 23:52:07 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:17.790 23:52:07 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:17.790 23:52:07 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:17.790 23:52:07 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:17.790 23:52:07 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:17.790 23:52:07 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:17.790 23:52:07 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:17.790 23:52:07 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:17.790 23:52:07 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:17.790 23:52:07 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:17.790 23:52:07 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:17.790 23:52:07 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:17.790 23:52:07 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:17.790 23:52:07 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:17.790 23:52:07 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:17.790 23:52:07 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:17.790 23:52:07 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:17.790 23:52:07 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:17.790 23:52:07 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:17.790 23:52:07 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:17.790 23:52:07 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:17.790 23:52:07 -- paths/export.sh@5 -- # export PATH 00:05:17.790 23:52:07 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:17.790 23:52:07 -- nvmf/common.sh@46 -- # : 0 00:05:17.790 23:52:07 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:17.790 23:52:07 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:17.790 23:52:07 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:17.790 23:52:07 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:17.790 23:52:07 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:17.790 23:52:07 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:17.790 23:52:07 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:17.790 23:52:07 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:17.790 23:52:07 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:17.790 23:52:07 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:17.790 23:52:07 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:17.790 23:52:07 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:17.790 23:52:07 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:17.790 WARNING: No tests are enabled so not running JSON configuration tests 00:05:17.790 23:52:07 -- json_config/json_config.sh@27 -- # exit 0 00:05:17.790 00:05:17.790 real 0m0.095s 00:05:17.790 user 0m0.044s 00:05:17.790 sys 0m0.053s 00:05:17.791 23:52:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:17.791 23:52:07 -- common/autotest_common.sh@10 -- # set +x 00:05:17.791 ************************************ 00:05:17.791 END TEST json_config 00:05:17.791 ************************************ 00:05:17.791 23:52:07 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:17.791 23:52:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:17.791 23:52:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:17.791 23:52:07 -- common/autotest_common.sh@10 -- # set +x 00:05:17.791 ************************************ 00:05:17.791 START TEST json_config_extra_key 00:05:17.791 ************************************ 00:05:17.791 23:52:07 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:17.791 23:52:07 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:17.791 23:52:07 -- nvmf/common.sh@7 -- # uname -s 00:05:17.791 23:52:07 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:17.791 23:52:07 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:17.791 23:52:07 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:17.791 23:52:07 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:17.791 23:52:07 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:17.791 23:52:07 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:17.791 23:52:07 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:17.791 23:52:07 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:18.050 23:52:07 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:18.050 23:52:07 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:18.050 23:52:07 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:18.050 23:52:07 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:18.050 23:52:07 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:18.050 23:52:07 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:18.050 23:52:07 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:18.050 23:52:07 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:18.050 23:52:07 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:18.050 23:52:07 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:18.050 23:52:07 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:18.050 23:52:07 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:18.050 23:52:07 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:18.050 23:52:07 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:18.050 23:52:07 -- paths/export.sh@5 -- # export PATH 00:05:18.050 23:52:07 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:18.050 23:52:07 -- nvmf/common.sh@46 -- # : 0 00:05:18.050 23:52:07 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:18.050 23:52:07 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:18.050 23:52:07 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:18.050 23:52:07 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:18.050 23:52:07 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:18.050 23:52:07 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:18.050 23:52:07 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:18.050 23:52:07 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:18.050 INFO: launching applications... 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=449814 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:18.050 Waiting for target to run... 
00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 449814 /var/tmp/spdk_tgt.sock 00:05:18.050 23:52:07 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:18.050 23:52:07 -- common/autotest_common.sh@819 -- # '[' -z 449814 ']' 00:05:18.050 23:52:07 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:18.050 23:52:07 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:18.050 23:52:07 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:18.050 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:18.050 23:52:07 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:18.050 23:52:07 -- common/autotest_common.sh@10 -- # set +x 00:05:18.050 [2024-04-25 23:52:07.449473] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:18.050 [2024-04-25 23:52:07.449539] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid449814 ] 00:05:18.050 EAL: No free 2048 kB hugepages reported on node 1 00:05:18.309 [2024-04-25 23:52:07.730645] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:18.309 [2024-04-25 23:52:07.750076] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:18.309 [2024-04-25 23:52:07.750191] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:18.874 23:52:08 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:18.874 23:52:08 -- common/autotest_common.sh@852 -- # return 0 00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:18.874 00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:18.874 INFO: shutting down applications... 
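The json_config_extra_key startup above reduces to a simple pattern: launch spdk_tgt with a JSON config and a private RPC socket, then poll that socket until the target answers. A minimal sketch of the same steps outside the harness, using the arguments recorded in this run and an rpc.py poll as a stand-in for the waitforlisten helper (the 30-second window is an assumption):

  ./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
      --json ./test/json_config/extra_key.json &
  TGT_PID=$!
  # Poll the private RPC socket until the target responds (-t is the timeout in seconds)
  ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock -t 30 spdk_get_version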
00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 449814 ]] 00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 449814 00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@50 -- # kill -0 449814 00:05:18.874 23:52:08 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:19.441 23:52:08 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:19.441 23:52:08 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:19.441 23:52:08 -- json_config/json_config_extra_key.sh@50 -- # kill -0 449814 00:05:19.441 23:52:08 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:19.441 23:52:08 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:19.441 23:52:08 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:19.441 23:52:08 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:19.441 SPDK target shutdown done 00:05:19.441 23:52:08 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:19.441 Success 00:05:19.441 00:05:19.441 real 0m1.450s 00:05:19.441 user 0m1.174s 00:05:19.441 sys 0m0.396s 00:05:19.441 23:52:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:19.441 23:52:08 -- common/autotest_common.sh@10 -- # set +x 00:05:19.441 ************************************ 00:05:19.441 END TEST json_config_extra_key 00:05:19.441 ************************************ 00:05:19.441 23:52:08 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:19.441 23:52:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:19.441 23:52:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:19.441 23:52:08 -- common/autotest_common.sh@10 -- # set +x 00:05:19.441 ************************************ 00:05:19.441 START TEST alias_rpc 00:05:19.441 ************************************ 00:05:19.441 23:52:08 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:19.441 * Looking for test storage... 00:05:19.441 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:19.442 23:52:08 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:19.442 23:52:08 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=450093 00:05:19.442 23:52:08 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 450093 00:05:19.442 23:52:08 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:19.442 23:52:08 -- common/autotest_common.sh@819 -- # '[' -z 450093 ']' 00:05:19.442 23:52:08 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:19.442 23:52:08 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:19.442 23:52:08 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:05:19.442 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:19.442 23:52:08 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:19.442 23:52:08 -- common/autotest_common.sh@10 -- # set +x 00:05:19.442 [2024-04-25 23:52:08.936070] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:19.442 [2024-04-25 23:52:08.936163] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid450093 ] 00:05:19.442 EAL: No free 2048 kB hugepages reported on node 1 00:05:19.442 [2024-04-25 23:52:09.005895] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:19.442 [2024-04-25 23:52:09.042038] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:19.442 [2024-04-25 23:52:09.042172] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:20.376 23:52:09 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:20.376 23:52:09 -- common/autotest_common.sh@852 -- # return 0 00:05:20.376 23:52:09 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:20.376 23:52:09 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 450093 00:05:20.376 23:52:09 -- common/autotest_common.sh@926 -- # '[' -z 450093 ']' 00:05:20.376 23:52:09 -- common/autotest_common.sh@930 -- # kill -0 450093 00:05:20.376 23:52:09 -- common/autotest_common.sh@931 -- # uname 00:05:20.376 23:52:09 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:20.376 23:52:09 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 450093 00:05:20.376 23:52:09 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:20.634 23:52:09 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:20.634 23:52:09 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 450093' 00:05:20.634 killing process with pid 450093 00:05:20.634 23:52:09 -- common/autotest_common.sh@945 -- # kill 450093 00:05:20.634 23:52:09 -- common/autotest_common.sh@950 -- # wait 450093 00:05:20.892 00:05:20.892 real 0m1.463s 00:05:20.892 user 0m1.557s 00:05:20.892 sys 0m0.436s 00:05:20.892 23:52:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:20.892 23:52:10 -- common/autotest_common.sh@10 -- # set +x 00:05:20.892 ************************************ 00:05:20.892 END TEST alias_rpc 00:05:20.892 ************************************ 00:05:20.892 23:52:10 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:05:20.892 23:52:10 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:20.892 23:52:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:20.892 23:52:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:20.892 23:52:10 -- common/autotest_common.sh@10 -- # set +x 00:05:20.892 ************************************ 00:05:20.892 START TEST spdkcli_tcp 00:05:20.892 ************************************ 00:05:20.892 23:52:10 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:20.892 * Looking for test storage... 
00:05:20.892 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:20.892 23:52:10 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:20.892 23:52:10 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:20.892 23:52:10 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:20.892 23:52:10 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:20.892 23:52:10 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:20.892 23:52:10 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:20.892 23:52:10 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:20.892 23:52:10 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:20.892 23:52:10 -- common/autotest_common.sh@10 -- # set +x 00:05:20.892 23:52:10 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=450397 00:05:20.892 23:52:10 -- spdkcli/tcp.sh@27 -- # waitforlisten 450397 00:05:20.892 23:52:10 -- common/autotest_common.sh@819 -- # '[' -z 450397 ']' 00:05:20.892 23:52:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:20.893 23:52:10 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:20.893 23:52:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:20.893 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:20.893 23:52:10 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:20.893 23:52:10 -- common/autotest_common.sh@10 -- # set +x 00:05:20.893 23:52:10 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:20.893 [2024-04-25 23:52:10.426572] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:05:20.893 [2024-04-25 23:52:10.426646] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid450397 ] 00:05:20.893 EAL: No free 2048 kB hugepages reported on node 1 00:05:20.893 [2024-04-25 23:52:10.495099] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:21.151 [2024-04-25 23:52:10.532756] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:21.151 [2024-04-25 23:52:10.532900] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:21.151 [2024-04-25 23:52:10.532902] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:21.717 23:52:11 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:21.717 23:52:11 -- common/autotest_common.sh@852 -- # return 0 00:05:21.717 23:52:11 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:21.717 23:52:11 -- spdkcli/tcp.sh@31 -- # socat_pid=450539 00:05:21.717 23:52:11 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:21.976 [ 00:05:21.976 "spdk_get_version", 00:05:21.976 "rpc_get_methods", 00:05:21.976 "trace_get_info", 00:05:21.976 "trace_get_tpoint_group_mask", 00:05:21.976 "trace_disable_tpoint_group", 00:05:21.976 "trace_enable_tpoint_group", 00:05:21.976 "trace_clear_tpoint_mask", 00:05:21.976 "trace_set_tpoint_mask", 00:05:21.976 "vfu_tgt_set_base_path", 00:05:21.976 "framework_get_pci_devices", 00:05:21.976 "framework_get_config", 00:05:21.976 "framework_get_subsystems", 00:05:21.976 "iobuf_get_stats", 00:05:21.976 "iobuf_set_options", 00:05:21.976 "sock_set_default_impl", 00:05:21.976 "sock_impl_set_options", 00:05:21.976 "sock_impl_get_options", 00:05:21.976 "vmd_rescan", 00:05:21.976 "vmd_remove_device", 00:05:21.976 "vmd_enable", 00:05:21.976 "accel_get_stats", 00:05:21.976 "accel_set_options", 00:05:21.976 "accel_set_driver", 00:05:21.976 "accel_crypto_key_destroy", 00:05:21.976 "accel_crypto_keys_get", 00:05:21.976 "accel_crypto_key_create", 00:05:21.976 "accel_assign_opc", 00:05:21.976 "accel_get_module_info", 00:05:21.976 "accel_get_opc_assignments", 00:05:21.976 "notify_get_notifications", 00:05:21.976 "notify_get_types", 00:05:21.976 "bdev_get_histogram", 00:05:21.976 "bdev_enable_histogram", 00:05:21.976 "bdev_set_qos_limit", 00:05:21.976 "bdev_set_qd_sampling_period", 00:05:21.976 "bdev_get_bdevs", 00:05:21.976 "bdev_reset_iostat", 00:05:21.976 "bdev_get_iostat", 00:05:21.976 "bdev_examine", 00:05:21.976 "bdev_wait_for_examine", 00:05:21.976 "bdev_set_options", 00:05:21.976 "scsi_get_devices", 00:05:21.976 "thread_set_cpumask", 00:05:21.976 "framework_get_scheduler", 00:05:21.976 "framework_set_scheduler", 00:05:21.976 "framework_get_reactors", 00:05:21.976 "thread_get_io_channels", 00:05:21.976 "thread_get_pollers", 00:05:21.976 "thread_get_stats", 00:05:21.976 "framework_monitor_context_switch", 00:05:21.976 "spdk_kill_instance", 00:05:21.976 "log_enable_timestamps", 00:05:21.976 "log_get_flags", 00:05:21.976 "log_clear_flag", 00:05:21.976 "log_set_flag", 00:05:21.976 "log_get_level", 00:05:21.976 "log_set_level", 00:05:21.976 "log_get_print_level", 00:05:21.976 "log_set_print_level", 00:05:21.976 "framework_enable_cpumask_locks", 00:05:21.976 "framework_disable_cpumask_locks", 00:05:21.976 "framework_wait_init", 00:05:21.976 
"framework_start_init", 00:05:21.976 "virtio_blk_create_transport", 00:05:21.976 "virtio_blk_get_transports", 00:05:21.976 "vhost_controller_set_coalescing", 00:05:21.976 "vhost_get_controllers", 00:05:21.976 "vhost_delete_controller", 00:05:21.976 "vhost_create_blk_controller", 00:05:21.976 "vhost_scsi_controller_remove_target", 00:05:21.976 "vhost_scsi_controller_add_target", 00:05:21.976 "vhost_start_scsi_controller", 00:05:21.976 "vhost_create_scsi_controller", 00:05:21.976 "ublk_recover_disk", 00:05:21.976 "ublk_get_disks", 00:05:21.976 "ublk_stop_disk", 00:05:21.976 "ublk_start_disk", 00:05:21.976 "ublk_destroy_target", 00:05:21.976 "ublk_create_target", 00:05:21.976 "nbd_get_disks", 00:05:21.976 "nbd_stop_disk", 00:05:21.976 "nbd_start_disk", 00:05:21.976 "env_dpdk_get_mem_stats", 00:05:21.976 "nvmf_subsystem_get_listeners", 00:05:21.976 "nvmf_subsystem_get_qpairs", 00:05:21.976 "nvmf_subsystem_get_controllers", 00:05:21.976 "nvmf_get_stats", 00:05:21.976 "nvmf_get_transports", 00:05:21.976 "nvmf_create_transport", 00:05:21.976 "nvmf_get_targets", 00:05:21.976 "nvmf_delete_target", 00:05:21.976 "nvmf_create_target", 00:05:21.976 "nvmf_subsystem_allow_any_host", 00:05:21.976 "nvmf_subsystem_remove_host", 00:05:21.976 "nvmf_subsystem_add_host", 00:05:21.976 "nvmf_subsystem_remove_ns", 00:05:21.976 "nvmf_subsystem_add_ns", 00:05:21.976 "nvmf_subsystem_listener_set_ana_state", 00:05:21.976 "nvmf_discovery_get_referrals", 00:05:21.976 "nvmf_discovery_remove_referral", 00:05:21.976 "nvmf_discovery_add_referral", 00:05:21.976 "nvmf_subsystem_remove_listener", 00:05:21.976 "nvmf_subsystem_add_listener", 00:05:21.976 "nvmf_delete_subsystem", 00:05:21.976 "nvmf_create_subsystem", 00:05:21.976 "nvmf_get_subsystems", 00:05:21.976 "nvmf_set_crdt", 00:05:21.976 "nvmf_set_config", 00:05:21.976 "nvmf_set_max_subsystems", 00:05:21.976 "iscsi_set_options", 00:05:21.976 "iscsi_get_auth_groups", 00:05:21.976 "iscsi_auth_group_remove_secret", 00:05:21.976 "iscsi_auth_group_add_secret", 00:05:21.976 "iscsi_delete_auth_group", 00:05:21.976 "iscsi_create_auth_group", 00:05:21.976 "iscsi_set_discovery_auth", 00:05:21.976 "iscsi_get_options", 00:05:21.976 "iscsi_target_node_request_logout", 00:05:21.976 "iscsi_target_node_set_redirect", 00:05:21.976 "iscsi_target_node_set_auth", 00:05:21.976 "iscsi_target_node_add_lun", 00:05:21.976 "iscsi_get_connections", 00:05:21.976 "iscsi_portal_group_set_auth", 00:05:21.976 "iscsi_start_portal_group", 00:05:21.976 "iscsi_delete_portal_group", 00:05:21.976 "iscsi_create_portal_group", 00:05:21.976 "iscsi_get_portal_groups", 00:05:21.976 "iscsi_delete_target_node", 00:05:21.976 "iscsi_target_node_remove_pg_ig_maps", 00:05:21.976 "iscsi_target_node_add_pg_ig_maps", 00:05:21.976 "iscsi_create_target_node", 00:05:21.976 "iscsi_get_target_nodes", 00:05:21.976 "iscsi_delete_initiator_group", 00:05:21.976 "iscsi_initiator_group_remove_initiators", 00:05:21.976 "iscsi_initiator_group_add_initiators", 00:05:21.976 "iscsi_create_initiator_group", 00:05:21.976 "iscsi_get_initiator_groups", 00:05:21.976 "vfu_virtio_create_scsi_endpoint", 00:05:21.976 "vfu_virtio_scsi_remove_target", 00:05:21.976 "vfu_virtio_scsi_add_target", 00:05:21.976 "vfu_virtio_create_blk_endpoint", 00:05:21.976 "vfu_virtio_delete_endpoint", 00:05:21.976 "iaa_scan_accel_module", 00:05:21.976 "dsa_scan_accel_module", 00:05:21.976 "ioat_scan_accel_module", 00:05:21.976 "accel_error_inject_error", 00:05:21.976 "bdev_iscsi_delete", 00:05:21.976 "bdev_iscsi_create", 00:05:21.976 "bdev_iscsi_set_options", 
00:05:21.976 "bdev_virtio_attach_controller", 00:05:21.976 "bdev_virtio_scsi_get_devices", 00:05:21.976 "bdev_virtio_detach_controller", 00:05:21.976 "bdev_virtio_blk_set_hotplug", 00:05:21.976 "bdev_ftl_set_property", 00:05:21.976 "bdev_ftl_get_properties", 00:05:21.976 "bdev_ftl_get_stats", 00:05:21.976 "bdev_ftl_unmap", 00:05:21.976 "bdev_ftl_unload", 00:05:21.976 "bdev_ftl_delete", 00:05:21.976 "bdev_ftl_load", 00:05:21.976 "bdev_ftl_create", 00:05:21.976 "bdev_aio_delete", 00:05:21.976 "bdev_aio_rescan", 00:05:21.976 "bdev_aio_create", 00:05:21.976 "blobfs_create", 00:05:21.976 "blobfs_detect", 00:05:21.976 "blobfs_set_cache_size", 00:05:21.976 "bdev_zone_block_delete", 00:05:21.976 "bdev_zone_block_create", 00:05:21.976 "bdev_delay_delete", 00:05:21.976 "bdev_delay_create", 00:05:21.976 "bdev_delay_update_latency", 00:05:21.976 "bdev_split_delete", 00:05:21.976 "bdev_split_create", 00:05:21.976 "bdev_error_inject_error", 00:05:21.976 "bdev_error_delete", 00:05:21.976 "bdev_error_create", 00:05:21.976 "bdev_raid_set_options", 00:05:21.976 "bdev_raid_remove_base_bdev", 00:05:21.976 "bdev_raid_add_base_bdev", 00:05:21.976 "bdev_raid_delete", 00:05:21.976 "bdev_raid_create", 00:05:21.976 "bdev_raid_get_bdevs", 00:05:21.976 "bdev_lvol_grow_lvstore", 00:05:21.976 "bdev_lvol_get_lvols", 00:05:21.976 "bdev_lvol_get_lvstores", 00:05:21.976 "bdev_lvol_delete", 00:05:21.976 "bdev_lvol_set_read_only", 00:05:21.976 "bdev_lvol_resize", 00:05:21.976 "bdev_lvol_decouple_parent", 00:05:21.976 "bdev_lvol_inflate", 00:05:21.976 "bdev_lvol_rename", 00:05:21.976 "bdev_lvol_clone_bdev", 00:05:21.976 "bdev_lvol_clone", 00:05:21.976 "bdev_lvol_snapshot", 00:05:21.976 "bdev_lvol_create", 00:05:21.976 "bdev_lvol_delete_lvstore", 00:05:21.976 "bdev_lvol_rename_lvstore", 00:05:21.976 "bdev_lvol_create_lvstore", 00:05:21.976 "bdev_passthru_delete", 00:05:21.976 "bdev_passthru_create", 00:05:21.976 "bdev_nvme_cuse_unregister", 00:05:21.976 "bdev_nvme_cuse_register", 00:05:21.976 "bdev_opal_new_user", 00:05:21.976 "bdev_opal_set_lock_state", 00:05:21.976 "bdev_opal_delete", 00:05:21.976 "bdev_opal_get_info", 00:05:21.976 "bdev_opal_create", 00:05:21.976 "bdev_nvme_opal_revert", 00:05:21.976 "bdev_nvme_opal_init", 00:05:21.976 "bdev_nvme_send_cmd", 00:05:21.977 "bdev_nvme_get_path_iostat", 00:05:21.977 "bdev_nvme_get_mdns_discovery_info", 00:05:21.977 "bdev_nvme_stop_mdns_discovery", 00:05:21.977 "bdev_nvme_start_mdns_discovery", 00:05:21.977 "bdev_nvme_set_multipath_policy", 00:05:21.977 "bdev_nvme_set_preferred_path", 00:05:21.977 "bdev_nvme_get_io_paths", 00:05:21.977 "bdev_nvme_remove_error_injection", 00:05:21.977 "bdev_nvme_add_error_injection", 00:05:21.977 "bdev_nvme_get_discovery_info", 00:05:21.977 "bdev_nvme_stop_discovery", 00:05:21.977 "bdev_nvme_start_discovery", 00:05:21.977 "bdev_nvme_get_controller_health_info", 00:05:21.977 "bdev_nvme_disable_controller", 00:05:21.977 "bdev_nvme_enable_controller", 00:05:21.977 "bdev_nvme_reset_controller", 00:05:21.977 "bdev_nvme_get_transport_statistics", 00:05:21.977 "bdev_nvme_apply_firmware", 00:05:21.977 "bdev_nvme_detach_controller", 00:05:21.977 "bdev_nvme_get_controllers", 00:05:21.977 "bdev_nvme_attach_controller", 00:05:21.977 "bdev_nvme_set_hotplug", 00:05:21.977 "bdev_nvme_set_options", 00:05:21.977 "bdev_null_resize", 00:05:21.977 "bdev_null_delete", 00:05:21.977 "bdev_null_create", 00:05:21.977 "bdev_malloc_delete", 00:05:21.977 "bdev_malloc_create" 00:05:21.977 ] 00:05:21.977 23:52:11 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
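The method list above was fetched over TCP rather than the target's UNIX socket: tcp.sh bridges /var/tmp/spdk.sock to 127.0.0.1:9998 with socat and points rpc.py at the TCP side. A condensed sketch of that bridge, reusing the exact addresses and flags from this run:

  socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
  SOCAT_PID=$!
  # -r retries the TCP connection up to 100 times; -t bounds each response wait to 2s
  ./scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
  kill $SOCAT_PID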
00:05:21.977 23:52:11 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:21.977 23:52:11 -- common/autotest_common.sh@10 -- # set +x 00:05:21.977 23:52:11 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:21.977 23:52:11 -- spdkcli/tcp.sh@38 -- # killprocess 450397 00:05:21.977 23:52:11 -- common/autotest_common.sh@926 -- # '[' -z 450397 ']' 00:05:21.977 23:52:11 -- common/autotest_common.sh@930 -- # kill -0 450397 00:05:21.977 23:52:11 -- common/autotest_common.sh@931 -- # uname 00:05:21.977 23:52:11 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:21.977 23:52:11 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 450397 00:05:21.977 23:52:11 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:21.977 23:52:11 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:21.977 23:52:11 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 450397' 00:05:21.977 killing process with pid 450397 00:05:21.977 23:52:11 -- common/autotest_common.sh@945 -- # kill 450397 00:05:21.977 23:52:11 -- common/autotest_common.sh@950 -- # wait 450397 00:05:22.235 00:05:22.235 real 0m1.462s 00:05:22.235 user 0m2.771s 00:05:22.235 sys 0m0.459s 00:05:22.235 23:52:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:22.235 23:52:11 -- common/autotest_common.sh@10 -- # set +x 00:05:22.235 ************************************ 00:05:22.235 END TEST spdkcli_tcp 00:05:22.235 ************************************ 00:05:22.235 23:52:11 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:22.235 23:52:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:22.235 23:52:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:22.235 23:52:11 -- common/autotest_common.sh@10 -- # set +x 00:05:22.235 ************************************ 00:05:22.235 START TEST dpdk_mem_utility 00:05:22.235 ************************************ 00:05:22.235 23:52:11 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:22.493 * Looking for test storage... 00:05:22.493 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:22.493 23:52:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:22.493 23:52:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=450730 00:05:22.493 23:52:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:22.493 23:52:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 450730 00:05:22.493 23:52:11 -- common/autotest_common.sh@819 -- # '[' -z 450730 ']' 00:05:22.493 23:52:11 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:22.493 23:52:11 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:22.493 23:52:11 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:22.493 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:22.493 23:52:11 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:22.493 23:52:11 -- common/autotest_common.sh@10 -- # set +x 00:05:22.493 [2024-04-25 23:52:11.960410] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:22.493 [2024-04-25 23:52:11.960498] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid450730 ] 00:05:22.493 EAL: No free 2048 kB hugepages reported on node 1 00:05:22.493 [2024-04-25 23:52:12.027068] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:22.493 [2024-04-25 23:52:12.063517] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:22.493 [2024-04-25 23:52:12.063633] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:23.428 23:52:12 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:23.428 23:52:12 -- common/autotest_common.sh@852 -- # return 0 00:05:23.428 23:52:12 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:23.428 23:52:12 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:23.428 23:52:12 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:23.428 23:52:12 -- common/autotest_common.sh@10 -- # set +x 00:05:23.428 { 00:05:23.428 "filename": "/tmp/spdk_mem_dump.txt" 00:05:23.428 } 00:05:23.428 23:52:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:23.428 23:52:12 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:23.428 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:23.428 1 heaps totaling size 814.000000 MiB 00:05:23.428 size: 814.000000 MiB heap id: 0 00:05:23.428 end heaps---------- 00:05:23.428 8 mempools totaling size 598.116089 MiB 00:05:23.428 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:23.428 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:23.428 size: 84.521057 MiB name: bdev_io_450730 00:05:23.428 size: 51.011292 MiB name: evtpool_450730 00:05:23.428 size: 50.003479 MiB name: msgpool_450730 00:05:23.428 size: 21.763794 MiB name: PDU_Pool 00:05:23.428 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:23.428 size: 0.026123 MiB name: Session_Pool 00:05:23.428 end mempools------- 00:05:23.428 6 memzones totaling size 4.142822 MiB 00:05:23.428 size: 1.000366 MiB name: RG_ring_0_450730 00:05:23.428 size: 1.000366 MiB name: RG_ring_1_450730 00:05:23.428 size: 1.000366 MiB name: RG_ring_4_450730 00:05:23.428 size: 1.000366 MiB name: RG_ring_5_450730 00:05:23.428 size: 0.125366 MiB name: RG_ring_2_450730 00:05:23.428 size: 0.015991 MiB name: RG_ring_3_450730 00:05:23.428 end memzones------- 00:05:23.428 23:52:12 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:23.428 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:23.428 list of free elements. 
size: 12.519348 MiB 00:05:23.428 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:23.428 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:23.428 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:23.428 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:23.428 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:23.428 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:23.428 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:23.428 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:23.428 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:23.428 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:23.428 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:23.428 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:23.428 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:23.428 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:23.428 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:23.428 list of standard malloc elements. size: 199.218079 MiB 00:05:23.428 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:23.428 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:23.428 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:23.428 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:23.428 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:23.428 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:23.428 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:23.428 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:23.428 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:23.428 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:23.428 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:23.428 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:23.428 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:23.428 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:23.428 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:23.428 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:23.428 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:23.428 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:23.428 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:23.428 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:23.428 list of memzone associated elements. size: 602.262573 MiB 00:05:23.428 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:23.428 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:23.428 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:23.428 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:23.428 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:23.428 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_450730_0 00:05:23.428 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:23.428 associated memzone info: size: 48.002930 MiB name: MP_evtpool_450730_0 00:05:23.428 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:23.428 associated memzone info: size: 48.002930 MiB name: MP_msgpool_450730_0 00:05:23.428 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:23.428 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:23.428 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:23.428 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:23.428 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:23.428 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_450730 00:05:23.428 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:23.428 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_450730 00:05:23.428 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:23.428 associated memzone info: size: 1.007996 MiB name: MP_evtpool_450730 00:05:23.428 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:23.428 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:23.428 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:23.428 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:23.428 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:23.428 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:23.428 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:23.428 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:23.428 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:23.428 associated memzone info: size: 1.000366 MiB name: RG_ring_0_450730 00:05:23.428 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:23.428 associated memzone info: size: 1.000366 MiB name: RG_ring_1_450730 00:05:23.429 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:23.429 associated memzone info: size: 1.000366 MiB name: RG_ring_4_450730 00:05:23.429 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:23.429 associated memzone info: size: 1.000366 MiB name: RG_ring_5_450730 00:05:23.429 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:23.429 associated memzone 
info: size: 0.500366 MiB name: RG_MP_bdev_io_450730 00:05:23.429 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:23.429 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:23.429 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:23.429 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:23.429 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:23.429 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:23.429 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:23.429 associated memzone info: size: 0.125366 MiB name: RG_ring_2_450730 00:05:23.429 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:23.429 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:23.429 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:23.429 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:23.429 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:23.429 associated memzone info: size: 0.015991 MiB name: RG_ring_3_450730 00:05:23.429 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:23.429 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:23.429 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:23.429 associated memzone info: size: 0.000183 MiB name: MP_msgpool_450730 00:05:23.429 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:23.429 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_450730 00:05:23.429 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:23.429 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:23.429 23:52:12 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:23.429 23:52:12 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 450730 00:05:23.429 23:52:12 -- common/autotest_common.sh@926 -- # '[' -z 450730 ']' 00:05:23.429 23:52:12 -- common/autotest_common.sh@930 -- # kill -0 450730 00:05:23.429 23:52:12 -- common/autotest_common.sh@931 -- # uname 00:05:23.429 23:52:12 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:23.429 23:52:12 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 450730 00:05:23.429 23:52:12 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:23.429 23:52:12 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:23.429 23:52:12 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 450730' 00:05:23.429 killing process with pid 450730 00:05:23.429 23:52:12 -- common/autotest_common.sh@945 -- # kill 450730 00:05:23.429 23:52:12 -- common/autotest_common.sh@950 -- # wait 450730 00:05:23.687 00:05:23.687 real 0m1.385s 00:05:23.687 user 0m1.425s 00:05:23.687 sys 0m0.427s 00:05:23.687 23:52:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:23.687 23:52:13 -- common/autotest_common.sh@10 -- # set +x 00:05:23.687 ************************************ 00:05:23.687 END TEST dpdk_mem_utility 00:05:23.687 ************************************ 00:05:23.687 23:52:13 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:23.687 23:52:13 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:23.687 23:52:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:23.687 23:52:13 -- common/autotest_common.sh@10 -- # set +x 00:05:23.687 
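The heap and memzone report above is produced in two steps: the env_dpdk_get_mem_stats RPC makes the target write /tmp/spdk_mem_dump.txt, and dpdk_mem_info.py renders that file. A sketch of the same flow against a running spdk_tgt, limited to the invocations seen in this run:

  ./scripts/rpc.py env_dpdk_get_mem_stats   # target reports {"filename": "/tmp/spdk_mem_dump.txt"}
  ./scripts/dpdk_mem_info.py                # summary: heaps, mempools, memzones
  ./scripts/dpdk_mem_info.py -m 0           # the '-m 0' form used above yields the per-element heap view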
************************************ 00:05:23.687 START TEST event 00:05:23.687 ************************************ 00:05:23.687 23:52:13 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:23.946 * Looking for test storage... 00:05:23.946 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:23.946 23:52:13 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:23.946 23:52:13 -- bdev/nbd_common.sh@6 -- # set -e 00:05:23.946 23:52:13 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:23.946 23:52:13 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:05:23.946 23:52:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:23.946 23:52:13 -- common/autotest_common.sh@10 -- # set +x 00:05:23.946 ************************************ 00:05:23.946 START TEST event_perf 00:05:23.946 ************************************ 00:05:23.946 23:52:13 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:23.946 Running I/O for 1 seconds...[2024-04-25 23:52:13.390884] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:23.946 [2024-04-25 23:52:13.390971] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid451013 ] 00:05:23.946 EAL: No free 2048 kB hugepages reported on node 1 00:05:23.946 [2024-04-25 23:52:13.462967] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:23.946 [2024-04-25 23:52:13.501195] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:23.946 [2024-04-25 23:52:13.501292] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:23.946 [2024-04-25 23:52:13.501380] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:23.946 [2024-04-25 23:52:13.501381] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:25.324 Running I/O for 1 seconds... 00:05:25.324 lcore 0: 188986 00:05:25.324 lcore 1: 188983 00:05:25.324 lcore 2: 188982 00:05:25.324 lcore 3: 188984 00:05:25.324 done. 
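The per-lcore tallies just printed come from a one-second event_perf run with a four-core mask: each reactor spins on its own lcore and counts how many events it processes. The invocation, exactly as recorded above:

  # -m 0xF schedules one reactor on each of lcores 0-3; -t 1 runs for one second
  ./test/event/event_perf/event_perf -m 0xF -t 1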
00:05:25.324 00:05:25.324 real 0m1.183s 00:05:25.324 user 0m4.086s 00:05:25.324 sys 0m0.096s 00:05:25.324 23:52:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.324 23:52:14 -- common/autotest_common.sh@10 -- # set +x 00:05:25.324 ************************************ 00:05:25.324 END TEST event_perf 00:05:25.324 ************************************ 00:05:25.324 23:52:14 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:25.324 23:52:14 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:25.324 23:52:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:25.324 23:52:14 -- common/autotest_common.sh@10 -- # set +x 00:05:25.324 ************************************ 00:05:25.324 START TEST event_reactor 00:05:25.324 ************************************ 00:05:25.324 23:52:14 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:25.324 [2024-04-25 23:52:14.623263] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:25.324 [2024-04-25 23:52:14.623352] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid451229 ] 00:05:25.324 EAL: No free 2048 kB hugepages reported on node 1 00:05:25.324 [2024-04-25 23:52:14.693258] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:25.324 [2024-04-25 23:52:14.727862] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:26.261 test_start 00:05:26.261 oneshot 00:05:26.261 tick 100 00:05:26.261 tick 100 00:05:26.261 tick 250 00:05:26.261 tick 100 00:05:26.261 tick 100 00:05:26.261 tick 100 00:05:26.261 tick 250 00:05:26.261 tick 500 00:05:26.261 tick 100 00:05:26.261 tick 100 00:05:26.261 tick 250 00:05:26.261 tick 100 00:05:26.261 tick 100 00:05:26.261 test_end 00:05:26.261 00:05:26.261 real 0m1.177s 00:05:26.261 user 0m1.087s 00:05:26.261 sys 0m0.086s 00:05:26.261 23:52:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:26.261 23:52:15 -- common/autotest_common.sh@10 -- # set +x 00:05:26.261 ************************************ 00:05:26.261 END TEST event_reactor 00:05:26.261 ************************************ 00:05:26.261 23:52:15 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:26.261 23:52:15 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:26.261 23:52:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:26.261 23:52:15 -- common/autotest_common.sh@10 -- # set +x 00:05:26.261 ************************************ 00:05:26.261 START TEST event_reactor_perf 00:05:26.261 ************************************ 00:05:26.261 23:52:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:26.261 [2024-04-25 23:52:15.846872] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:05:26.261 [2024-04-25 23:52:15.846977] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid451512 ] 00:05:26.519 EAL: No free 2048 kB hugepages reported on node 1 00:05:26.520 [2024-04-25 23:52:15.917526] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:26.520 [2024-04-25 23:52:15.951781] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:27.455 test_start 00:05:27.455 test_end 00:05:27.455 Performance: 925007 events per second 00:05:27.455 00:05:27.455 real 0m1.176s 00:05:27.455 user 0m1.084s 00:05:27.455 sys 0m0.088s 00:05:27.455 23:52:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:27.455 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.455 ************************************ 00:05:27.455 END TEST event_reactor_perf 00:05:27.455 ************************************ 00:05:27.455 23:52:17 -- event/event.sh@49 -- # uname -s 00:05:27.455 23:52:17 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:27.455 23:52:17 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:27.455 23:52:17 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:27.455 23:52:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:27.455 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.455 ************************************ 00:05:27.455 START TEST event_scheduler 00:05:27.456 ************************************ 00:05:27.456 23:52:17 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:27.715 * Looking for test storage... 00:05:27.715 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:27.715 23:52:17 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:27.715 23:52:17 -- scheduler/scheduler.sh@35 -- # scheduler_pid=451824 00:05:27.715 23:52:17 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:27.715 23:52:17 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:27.715 23:52:17 -- scheduler/scheduler.sh@37 -- # waitforlisten 451824 00:05:27.715 23:52:17 -- common/autotest_common.sh@819 -- # '[' -z 451824 ']' 00:05:27.715 23:52:17 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:27.715 23:52:17 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:27.715 23:52:17 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:27.715 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:27.715 23:52:17 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:27.715 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.715 [2024-04-25 23:52:17.166378] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:05:27.715 [2024-04-25 23:52:17.166459] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid451824 ] 00:05:27.715 EAL: No free 2048 kB hugepages reported on node 1 00:05:27.715 [2024-04-25 23:52:17.232509] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:27.715 [2024-04-25 23:52:17.270861] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:27.715 [2024-04-25 23:52:17.270946] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:27.715 [2024-04-25 23:52:17.271030] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:27.715 [2024-04-25 23:52:17.271032] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:27.715 23:52:17 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:27.715 23:52:17 -- common/autotest_common.sh@852 -- # return 0 00:05:27.715 23:52:17 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:27.715 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.715 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.975 POWER: Env isn't set yet! 00:05:27.975 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:27.975 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:27.975 POWER: Cannot set governor of lcore 0 to userspace 00:05:27.975 POWER: Attempting to initialise PSTAT power management... 00:05:27.975 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:27.975 POWER: Initialized successfully for lcore 0 power management 00:05:27.975 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:27.975 POWER: Initialized successfully for lcore 1 power management 00:05:27.975 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:27.975 POWER: Initialized successfully for lcore 2 power management 00:05:27.975 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:27.975 POWER: Initialized successfully for lcore 3 power management 00:05:27.975 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.975 23:52:17 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:27.975 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.975 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.975 [2024-04-25 23:52:17.423553] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
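With the performance governors in place, the scheduler test drives everything through a test-local rpc.py plugin: threads are created with a name, an optional pin mask (-m), and a target busy percentage (-a), then retuned or deleted while the dynamic scheduler (selected via framework_set_scheduler above) rebalances them. The calls that follow in this run, written out directly (loading scheduler_plugin via --plugin assumes PYTHONPATH includes the test's scheduler directory):

  ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
  ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_set_active 11 50   # thread 11 -> 50% active
  ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_delete 12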
00:05:27.975 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.975 23:52:17 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:27.975 23:52:17 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:27.975 23:52:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:27.975 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.975 ************************************ 00:05:27.975 START TEST scheduler_create_thread 00:05:27.976 ************************************ 00:05:27.976 23:52:17 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 2 00:05:27.976 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 3 00:05:27.976 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 4 00:05:27.976 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 5 00:05:27.976 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 6 00:05:27.976 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 7 00:05:27.976 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 8 00:05:27.976 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 9 00:05:27.976 
23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 10 00:05:27.976 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 23:52:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:27.976 23:52:17 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:27.976 23:52:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:27.976 23:52:17 -- common/autotest_common.sh@10 -- # set +x 00:05:28.546 23:52:18 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:28.546 23:52:18 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:28.546 23:52:18 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:28.546 23:52:18 -- common/autotest_common.sh@10 -- # set +x 00:05:29.927 23:52:19 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:29.927 23:52:19 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:29.927 23:52:19 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:29.927 23:52:19 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:29.927 23:52:19 -- common/autotest_common.sh@10 -- # set +x 00:05:31.305 23:52:20 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:31.305 00:05:31.305 real 0m3.100s 00:05:31.305 user 0m0.023s 00:05:31.305 sys 0m0.006s 00:05:31.305 23:52:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:31.305 23:52:20 -- common/autotest_common.sh@10 -- # set +x 00:05:31.305 ************************************ 00:05:31.305 END TEST scheduler_create_thread 00:05:31.305 ************************************ 00:05:31.305 23:52:20 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:31.305 23:52:20 -- scheduler/scheduler.sh@46 -- # killprocess 451824 00:05:31.305 23:52:20 -- common/autotest_common.sh@926 -- # '[' -z 451824 ']' 00:05:31.305 23:52:20 -- common/autotest_common.sh@930 -- # kill -0 451824 00:05:31.305 23:52:20 -- common/autotest_common.sh@931 -- # uname 00:05:31.305 23:52:20 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:31.305 23:52:20 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 451824 00:05:31.305 23:52:20 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:31.305 23:52:20 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:31.305 23:52:20 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 451824' 00:05:31.305 killing process with pid 451824 00:05:31.305 23:52:20 -- common/autotest_common.sh@945 -- # kill 451824 00:05:31.305 23:52:20 -- common/autotest_common.sh@950 -- # wait 451824 00:05:31.305 [2024-04-25 23:52:20.910730] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
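The scheduler_create_thread trace above drives SPDK's scheduler test plugin over RPC: four busy threads pinned to cores 0-3 (-m 0x1 through 0x8 with -a 100), four idle counterparts (-a 0), unpinned threads whose activity is adjusted with scheduler_thread_set_active, and one thread created only to be deleted again. A condensed, hedged sketch of the same calls follows; it assumes the scheduler test app is already serving the default RPC socket, that rpc.py sits at the path below, and that the plugin module is importable as scheduler_plugin (the trace uses the repo's rpc_cmd wrapper, which handles this setup):

    RPC="./scripts/rpc.py"   # path is an assumption; adjust to your checkout

    # Busy threads pinned to cores 0-3 (100% active), then idle twins (0% active)
    for mask in 0x1 0x2 0x4 0x8; do
        $RPC --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m "$mask" -a 100
        $RPC --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m "$mask" -a 0
    done

    # Unpinned thread; the RPC prints the new thread id (the trace captures it
    # as thread_id=11) and set_active then raises its load to 50%
    id=$($RPC --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)
    $RPC --plugin scheduler_plugin scheduler_thread_set_active "$id" 50

    # Create a thread and delete it again to exercise teardown
    id=$($RPC --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)
    $RPC --plugin scheduler_plugin scheduler_thread_delete "$id"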
00:05:31.566 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:31.566 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:31.566 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:31.566 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:31.566 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:31.566 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:31.566 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:31.566 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:31.566 00:05:31.566 real 0m4.051s 00:05:31.566 user 0m6.556s 00:05:31.566 sys 0m0.351s 00:05:31.566 23:52:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:31.566 23:52:21 -- common/autotest_common.sh@10 -- # set +x 00:05:31.566 ************************************ 00:05:31.566 END TEST event_scheduler 00:05:31.566 ************************************ 00:05:31.566 23:52:21 -- event/event.sh@51 -- # modprobe -n nbd 00:05:31.566 23:52:21 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:31.566 23:52:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:31.566 23:52:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:31.566 23:52:21 -- common/autotest_common.sh@10 -- # set +x 00:05:31.566 ************************************ 00:05:31.566 START TEST app_repeat 00:05:31.566 ************************************ 00:05:31.566 23:52:21 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:05:31.566 23:52:21 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:31.566 23:52:21 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:31.566 23:52:21 -- event/event.sh@13 -- # local nbd_list 00:05:31.566 23:52:21 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:31.566 23:52:21 -- event/event.sh@14 -- # local bdev_list 00:05:31.566 23:52:21 -- event/event.sh@15 -- # local repeat_times=4 00:05:31.566 23:52:21 -- event/event.sh@17 -- # modprobe nbd 00:05:31.566 23:52:21 -- event/event.sh@19 -- # repeat_pid=452559 00:05:31.566 23:52:21 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:31.566 23:52:21 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:31.566 23:52:21 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 452559' 00:05:31.566 Process app_repeat pid: 452559 00:05:31.566 23:52:21 -- event/event.sh@23 -- # for i in {0..2} 00:05:31.566 23:52:21 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:31.566 spdk_app_start Round 0 00:05:31.566 23:52:21 -- event/event.sh@25 -- # waitforlisten 452559 /var/tmp/spdk-nbd.sock 00:05:31.566 23:52:21 -- common/autotest_common.sh@819 -- # '[' -z 452559 ']' 00:05:31.566 23:52:21 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:31.566 23:52:21 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:31.566 23:52:21 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:05:31.566 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:31.566 23:52:21 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:31.566 23:52:21 -- common/autotest_common.sh@10 -- # set +x 00:05:31.827 [2024-04-25 23:52:21.186757] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:31.827 [2024-04-25 23:52:21.186845] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid452559 ] 00:05:31.827 EAL: No free 2048 kB hugepages reported on node 1 00:05:31.827 [2024-04-25 23:52:21.257324] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:31.827 [2024-04-25 23:52:21.293138] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:31.827 [2024-04-25 23:52:21.293141] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:32.396 23:52:22 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:32.396 23:52:22 -- common/autotest_common.sh@852 -- # return 0 00:05:32.396 23:52:22 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:32.656 Malloc0 00:05:32.656 23:52:22 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:32.916 Malloc1 00:05:32.916 23:52:22 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@12 -- # local i 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:32.916 23:52:22 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:33.175 /dev/nbd0 00:05:33.175 23:52:22 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:33.175 23:52:22 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:33.175 23:52:22 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:33.175 23:52:22 -- common/autotest_common.sh@857 -- # local i 00:05:33.175 23:52:22 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:33.175 23:52:22 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:33.175 23:52:22 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:33.175 23:52:22 -- 
common/autotest_common.sh@861 -- # break 00:05:33.175 23:52:22 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:33.175 23:52:22 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:33.175 23:52:22 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:33.175 1+0 records in 00:05:33.175 1+0 records out 00:05:33.175 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256548 s, 16.0 MB/s 00:05:33.175 23:52:22 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:33.175 23:52:22 -- common/autotest_common.sh@874 -- # size=4096 00:05:33.175 23:52:22 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:33.175 23:52:22 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:33.175 23:52:22 -- common/autotest_common.sh@877 -- # return 0 00:05:33.175 23:52:22 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:33.175 23:52:22 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:33.175 23:52:22 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:33.175 /dev/nbd1 00:05:33.175 23:52:22 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:33.175 23:52:22 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:33.175 23:52:22 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:33.176 23:52:22 -- common/autotest_common.sh@857 -- # local i 00:05:33.176 23:52:22 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:33.176 23:52:22 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:33.176 23:52:22 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:33.176 23:52:22 -- common/autotest_common.sh@861 -- # break 00:05:33.176 23:52:22 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:33.176 23:52:22 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:33.176 23:52:22 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:33.435 1+0 records in 00:05:33.435 1+0 records out 00:05:33.435 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00016269 s, 25.2 MB/s 00:05:33.435 23:52:22 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:33.435 23:52:22 -- common/autotest_common.sh@874 -- # size=4096 00:05:33.435 23:52:22 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:33.435 23:52:22 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:33.435 23:52:22 -- common/autotest_common.sh@877 -- # return 0 00:05:33.435 23:52:22 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:33.435 23:52:22 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:33.435 23:52:22 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:33.435 23:52:22 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:33.435 23:52:22 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:33.435 23:52:22 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:33.435 { 00:05:33.435 "nbd_device": "/dev/nbd0", 00:05:33.435 "bdev_name": "Malloc0" 00:05:33.435 }, 00:05:33.435 { 00:05:33.435 "nbd_device": 
"/dev/nbd1", 00:05:33.435 "bdev_name": "Malloc1" 00:05:33.435 } 00:05:33.435 ]' 00:05:33.435 23:52:22 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:33.435 { 00:05:33.435 "nbd_device": "/dev/nbd0", 00:05:33.435 "bdev_name": "Malloc0" 00:05:33.435 }, 00:05:33.435 { 00:05:33.435 "nbd_device": "/dev/nbd1", 00:05:33.435 "bdev_name": "Malloc1" 00:05:33.435 } 00:05:33.435 ]' 00:05:33.435 23:52:22 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:33.435 /dev/nbd1' 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:33.435 /dev/nbd1' 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@65 -- # count=2 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@95 -- # count=2 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:33.435 256+0 records in 00:05:33.435 256+0 records out 00:05:33.435 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114575 s, 91.5 MB/s 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:33.435 23:52:23 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:33.695 256+0 records in 00:05:33.695 256+0 records out 00:05:33.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0201349 s, 52.1 MB/s 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:33.695 256+0 records in 00:05:33.695 256+0 records out 00:05:33.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.020863 s, 50.3 MB/s 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:33.695 23:52:23 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@51 -- # local i 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:33.695 23:52:23 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@41 -- # break 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@45 -- # return 0 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@41 -- # break 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@45 -- # return 0 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:33.955 23:52:23 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@65 -- # true 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@65 -- # count=0 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@104 -- # count=0 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:34.216 23:52:23 -- bdev/nbd_common.sh@109 -- # return 0 00:05:34.216 23:52:23 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
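Each app_repeat round in this trace follows the same recipe: create two 64 MB malloc bdevs over the app's private RPC socket, export them as /dev/nbd0 and /dev/nbd1, push random data through the NBD layer, verify it, then tear everything down. A minimal sketch of the setup half, assuming a running SPDK app serving RPC on /var/tmp/spdk-nbd.sock and rpc.py at the path below (socket path, sizes, and the 20-iteration retry all come from the trace's waitfornbd helper):

    RPC="./scripts/rpc.py -s /var/tmp/spdk-nbd.sock"   # rpc.py location is an assumption

    # Two 64 MB malloc bdevs with 4096-byte blocks; the RPC prints the bdev name
    $RPC bdev_malloc_create 64 4096    # -> Malloc0
    $RPC bdev_malloc_create 64 4096    # -> Malloc1

    # Export each bdev as a kernel NBD device
    $RPC nbd_start_disk Malloc0 /dev/nbd0
    $RPC nbd_start_disk Malloc1 /dev/nbd1

    # Wait (up to ~2 s) for the kernel to publish each device in /proc/partitions
    for nbd in nbd0 nbd1; do
        for i in $(seq 1 20); do
            grep -q -w "$nbd" /proc/partitions && break
            sleep 0.1
        done
    done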
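The verification half is plain dd and cmp, mirroring the dd/cmp pairs in the trace: 1 MiB of /dev/urandom is staged in a scratch file, written through each NBD device with O_DIRECT, and compared byte-for-byte against the source. A sketch under the same assumptions (the trace stages its file under the repo's test/event directory; any scratch path works):

    TMP=$(mktemp /tmp/nbdrandtest.XXXXXX)

    # 256 x 4 KiB blocks = 1 MiB of reference data
    dd if=/dev/urandom of="$TMP" bs=4096 count=256

    for nbd in /dev/nbd0 /dev/nbd1; do
        # Write through the NBD device, bypassing the page cache
        dd if="$TMP" of="$nbd" bs=4096 count=256 oflag=direct
        # cmp -b reports the first differing byte; non-zero exit means corruption
        cmp -b -n 1M "$TMP" "$nbd" || echo "data mismatch on $nbd" >&2
    done
    rm -f "$TMP"

oflag=direct is what makes the check meaningful: every 4 KiB block must actually traverse the NBD/bdev stack rather than being satisfied by the page cache, which is also why the direct-write throughput in the trace (~50 MB/s) sits well below the cached urandom staging (~90 MB/s).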
00:05:34.475 23:52:23 -- event/event.sh@35 -- # sleep 3 00:05:34.475 [2024-04-25 23:52:24.084633] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:34.736 [2024-04-25 23:52:24.117226] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:34.736 [2024-04-25 23:52:24.117228] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.736 [2024-04-25 23:52:24.156634] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:34.736 [2024-04-25 23:52:24.156676] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:38.024 23:52:26 -- event/event.sh@23 -- # for i in {0..2} 00:05:38.024 23:52:26 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:38.024 spdk_app_start Round 1 00:05:38.024 23:52:26 -- event/event.sh@25 -- # waitforlisten 452559 /var/tmp/spdk-nbd.sock 00:05:38.024 23:52:26 -- common/autotest_common.sh@819 -- # '[' -z 452559 ']' 00:05:38.024 23:52:26 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:38.024 23:52:26 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:38.024 23:52:26 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:38.024 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:38.024 23:52:26 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:38.024 23:52:26 -- common/autotest_common.sh@10 -- # set +x 00:05:38.024 23:52:27 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:38.024 23:52:27 -- common/autotest_common.sh@852 -- # return 0 00:05:38.024 23:52:27 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:38.024 Malloc0 00:05:38.024 23:52:27 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:38.024 Malloc1 00:05:38.024 23:52:27 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@12 -- # local i 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:38.024 23:52:27 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:38.024 
/dev/nbd0 00:05:38.283 23:52:27 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:38.283 23:52:27 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:38.283 23:52:27 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:38.283 23:52:27 -- common/autotest_common.sh@857 -- # local i 00:05:38.283 23:52:27 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:38.283 23:52:27 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:38.283 23:52:27 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:38.283 23:52:27 -- common/autotest_common.sh@861 -- # break 00:05:38.283 23:52:27 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:38.283 23:52:27 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:38.284 23:52:27 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:38.284 1+0 records in 00:05:38.284 1+0 records out 00:05:38.284 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227877 s, 18.0 MB/s 00:05:38.284 23:52:27 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:38.284 23:52:27 -- common/autotest_common.sh@874 -- # size=4096 00:05:38.284 23:52:27 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:38.284 23:52:27 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:38.284 23:52:27 -- common/autotest_common.sh@877 -- # return 0 00:05:38.284 23:52:27 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:38.284 23:52:27 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:38.284 23:52:27 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:38.284 /dev/nbd1 00:05:38.284 23:52:27 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:38.284 23:52:27 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:38.284 23:52:27 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:38.284 23:52:27 -- common/autotest_common.sh@857 -- # local i 00:05:38.284 23:52:27 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:38.284 23:52:27 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:38.284 23:52:27 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:38.284 23:52:27 -- common/autotest_common.sh@861 -- # break 00:05:38.284 23:52:27 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:38.284 23:52:27 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:38.284 23:52:27 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:38.284 1+0 records in 00:05:38.284 1+0 records out 00:05:38.284 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245693 s, 16.7 MB/s 00:05:38.284 23:52:27 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:38.284 23:52:27 -- common/autotest_common.sh@874 -- # size=4096 00:05:38.284 23:52:27 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:38.284 23:52:27 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:38.284 23:52:27 -- common/autotest_common.sh@877 -- # return 0 00:05:38.284 23:52:27 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:38.284 23:52:27 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:05:38.284 23:52:27 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:38.284 23:52:27 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:38.284 23:52:27 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:38.543 23:52:28 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:38.543 { 00:05:38.543 "nbd_device": "/dev/nbd0", 00:05:38.543 "bdev_name": "Malloc0" 00:05:38.543 }, 00:05:38.543 { 00:05:38.543 "nbd_device": "/dev/nbd1", 00:05:38.543 "bdev_name": "Malloc1" 00:05:38.543 } 00:05:38.543 ]' 00:05:38.543 23:52:28 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:38.543 { 00:05:38.543 "nbd_device": "/dev/nbd0", 00:05:38.543 "bdev_name": "Malloc0" 00:05:38.543 }, 00:05:38.543 { 00:05:38.543 "nbd_device": "/dev/nbd1", 00:05:38.543 "bdev_name": "Malloc1" 00:05:38.543 } 00:05:38.543 ]' 00:05:38.543 23:52:28 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:38.543 23:52:28 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:38.543 /dev/nbd1' 00:05:38.543 23:52:28 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:38.543 /dev/nbd1' 00:05:38.543 23:52:28 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:38.543 23:52:28 -- bdev/nbd_common.sh@65 -- # count=2 00:05:38.543 23:52:28 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:38.543 23:52:28 -- bdev/nbd_common.sh@95 -- # count=2 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:38.544 256+0 records in 00:05:38.544 256+0 records out 00:05:38.544 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114223 s, 91.8 MB/s 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:38.544 256+0 records in 00:05:38.544 256+0 records out 00:05:38.544 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0198886 s, 52.7 MB/s 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:38.544 23:52:28 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:38.803 256+0 records in 00:05:38.803 256+0 records out 00:05:38.803 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212913 s, 49.2 MB/s 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:38.803 23:52:28 -- bdev/nbd_common.sh@51 -- # local i 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@41 -- # break 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@45 -- # return 0 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:38.804 23:52:28 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:39.063 23:52:28 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:39.063 23:52:28 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:39.063 23:52:28 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:39.063 23:52:28 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:39.063 23:52:28 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:39.063 23:52:28 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:39.063 23:52:28 -- bdev/nbd_common.sh@41 -- # break 00:05:39.063 23:52:28 -- bdev/nbd_common.sh@45 -- # return 0 00:05:39.063 23:52:28 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:39.063 23:52:28 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:39.064 23:52:28 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:39.324 23:52:28 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@65 -- # true 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@65 -- # count=0 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@104 -- # count=0 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:39.324 23:52:28 -- bdev/nbd_common.sh@109 -- # return 0 00:05:39.324 23:52:28 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:39.584 23:52:28 -- event/event.sh@35 -- # sleep 3 00:05:39.584 [2024-04-25 23:52:29.123363] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:39.584 [2024-04-25 23:52:29.155795] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:39.584 [2024-04-25 23:52:29.155799] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.843 [2024-04-25 23:52:29.195690] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:39.843 [2024-04-25 23:52:29.195729] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:42.381 23:52:31 -- event/event.sh@23 -- # for i in {0..2} 00:05:42.381 23:52:31 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:42.381 spdk_app_start Round 2 00:05:42.381 23:52:31 -- event/event.sh@25 -- # waitforlisten 452559 /var/tmp/spdk-nbd.sock 00:05:42.381 23:52:31 -- common/autotest_common.sh@819 -- # '[' -z 452559 ']' 00:05:42.381 23:52:31 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:42.381 23:52:31 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:42.381 23:52:31 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:42.381 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
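Teardown at the end of each round is symmetric with setup: stop each NBD export, poll /proc/partitions until the kernel drops the device (the waitfornbd_exit loop in the trace), then ask the app to exit over RPC; event.sh's trap/killprocess pair remains as a safety net if the RPC path fails. A sketch under the same assumptions as above:

    RPC="./scripts/rpc.py -s /var/tmp/spdk-nbd.sock"   # rpc.py location is an assumption

    for nbd in nbd0 nbd1; do
        $RPC nbd_stop_disk "/dev/$nbd"
        for i in $(seq 1 20); do
            grep -q -w "$nbd" /proc/partitions || break   # device is gone
            sleep 0.1
        done
    done

    # Graceful shutdown; app_repeat then sleeps and starts the next round
    $RPC spdk_kill_instance SIGTERM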
00:05:42.381 23:52:31 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:42.381 23:52:31 -- common/autotest_common.sh@10 -- # set +x 00:05:42.640 23:52:32 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:42.640 23:52:32 -- common/autotest_common.sh@852 -- # return 0 00:05:42.640 23:52:32 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:42.900 Malloc0 00:05:42.900 23:52:32 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:42.900 Malloc1 00:05:42.900 23:52:32 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@12 -- # local i 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:42.900 23:52:32 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:43.159 /dev/nbd0 00:05:43.159 23:52:32 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:43.159 23:52:32 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:43.159 23:52:32 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:43.159 23:52:32 -- common/autotest_common.sh@857 -- # local i 00:05:43.159 23:52:32 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:43.159 23:52:32 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:43.159 23:52:32 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:43.159 23:52:32 -- common/autotest_common.sh@861 -- # break 00:05:43.159 23:52:32 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:43.159 23:52:32 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:43.159 23:52:32 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:43.159 1+0 records in 00:05:43.159 1+0 records out 00:05:43.159 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232484 s, 17.6 MB/s 00:05:43.159 23:52:32 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:43.159 23:52:32 -- common/autotest_common.sh@874 -- # size=4096 00:05:43.159 23:52:32 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:43.159 23:52:32 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:43.159 23:52:32 -- common/autotest_common.sh@877 -- # return 0 00:05:43.159 23:52:32 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:43.159 23:52:32 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:43.159 23:52:32 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:43.418 /dev/nbd1 00:05:43.418 23:52:32 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:43.418 23:52:32 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:43.418 23:52:32 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:43.418 23:52:32 -- common/autotest_common.sh@857 -- # local i 00:05:43.418 23:52:32 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:43.418 23:52:32 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:43.418 23:52:32 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:43.418 23:52:32 -- common/autotest_common.sh@861 -- # break 00:05:43.418 23:52:32 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:43.418 23:52:32 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:43.418 23:52:32 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:43.418 1+0 records in 00:05:43.418 1+0 records out 00:05:43.418 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024857 s, 16.5 MB/s 00:05:43.418 23:52:32 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:43.418 23:52:32 -- common/autotest_common.sh@874 -- # size=4096 00:05:43.418 23:52:32 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:43.418 23:52:32 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:43.418 23:52:32 -- common/autotest_common.sh@877 -- # return 0 00:05:43.418 23:52:32 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:43.418 23:52:32 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:43.418 23:52:32 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:43.418 23:52:32 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:43.418 23:52:32 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:43.418 23:52:33 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:43.418 { 00:05:43.418 "nbd_device": "/dev/nbd0", 00:05:43.418 "bdev_name": "Malloc0" 00:05:43.418 }, 00:05:43.418 { 00:05:43.418 "nbd_device": "/dev/nbd1", 00:05:43.418 "bdev_name": "Malloc1" 00:05:43.418 } 00:05:43.418 ]' 00:05:43.418 23:52:33 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:43.418 { 00:05:43.418 "nbd_device": "/dev/nbd0", 00:05:43.418 "bdev_name": "Malloc0" 00:05:43.418 }, 00:05:43.418 { 00:05:43.418 "nbd_device": "/dev/nbd1", 00:05:43.418 "bdev_name": "Malloc1" 00:05:43.418 } 00:05:43.418 ]' 00:05:43.418 23:52:33 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:43.677 /dev/nbd1' 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:43.677 /dev/nbd1' 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@65 -- # count=2 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:43.677 23:52:33 -- 
bdev/nbd_common.sh@95 -- # count=2 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:43.677 256+0 records in 00:05:43.677 256+0 records out 00:05:43.677 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105276 s, 99.6 MB/s 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:43.677 256+0 records in 00:05:43.677 256+0 records out 00:05:43.677 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202018 s, 51.9 MB/s 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:43.677 256+0 records in 00:05:43.677 256+0 records out 00:05:43.677 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209242 s, 50.1 MB/s 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@51 -- # local i 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:43.677 23:52:33 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@41 -- # break 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@45 -- # return 0 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@41 -- # break 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@45 -- # return 0 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:43.937 23:52:33 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@65 -- # true 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@65 -- # count=0 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@104 -- # count=0 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:44.195 23:52:33 -- bdev/nbd_common.sh@109 -- # return 0 00:05:44.195 23:52:33 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:44.455 23:52:33 -- event/event.sh@35 -- # sleep 3 00:05:44.714 [2024-04-25 23:52:34.099903] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:44.714 [2024-04-25 23:52:34.132245] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:44.714 [2024-04-25 23:52:34.132247] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.714 [2024-04-25 23:52:34.172076] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:44.714 [2024-04-25 23:52:34.172124] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
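Around each teardown the rounds also sanity-check the export count: nbd_get_disks returns a JSON array of {nbd_device, bdev_name} pairs, jq extracts the device paths, and grep -c counts them; the test expects 2 while the disks are up and 0 afterwards. A sketch of that check (same RPC socket assumption; the trailing || true keeps grep's non-zero exit on an empty list from tripping set -e, matching the bare "true" in the trace):

    RPC="./scripts/rpc.py -s /var/tmp/spdk-nbd.sock"   # rpc.py location is an assumption

    count=$($RPC nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
    if [ "$count" -ne 2 ]; then
        echo "expected 2 exported NBD devices, found $count" >&2
    fi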
00:05:48.004 23:52:36 -- event/event.sh@38 -- # waitforlisten 452559 /var/tmp/spdk-nbd.sock 00:05:48.004 23:52:36 -- common/autotest_common.sh@819 -- # '[' -z 452559 ']' 00:05:48.004 23:52:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:48.004 23:52:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:48.004 23:52:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:48.004 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:48.004 23:52:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:48.004 23:52:36 -- common/autotest_common.sh@10 -- # set +x 00:05:48.004 23:52:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:48.004 23:52:37 -- common/autotest_common.sh@852 -- # return 0 00:05:48.004 23:52:37 -- event/event.sh@39 -- # killprocess 452559 00:05:48.004 23:52:37 -- common/autotest_common.sh@926 -- # '[' -z 452559 ']' 00:05:48.004 23:52:37 -- common/autotest_common.sh@930 -- # kill -0 452559 00:05:48.004 23:52:37 -- common/autotest_common.sh@931 -- # uname 00:05:48.004 23:52:37 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:48.004 23:52:37 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 452559 00:05:48.004 23:52:37 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:48.004 23:52:37 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:48.004 23:52:37 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 452559' 00:05:48.004 killing process with pid 452559 00:05:48.004 23:52:37 -- common/autotest_common.sh@945 -- # kill 452559 00:05:48.004 23:52:37 -- common/autotest_common.sh@950 -- # wait 452559 00:05:48.004 spdk_app_start is called in Round 0. 00:05:48.004 Shutdown signal received, stop current app iteration 00:05:48.004 Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 reinitialization... 00:05:48.004 spdk_app_start is called in Round 1. 00:05:48.004 Shutdown signal received, stop current app iteration 00:05:48.004 Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 reinitialization... 00:05:48.004 spdk_app_start is called in Round 2. 00:05:48.004 Shutdown signal received, stop current app iteration 00:05:48.004 Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 reinitialization... 00:05:48.004 spdk_app_start is called in Round 3. 
00:05:48.004 Shutdown signal received, stop current app iteration 00:05:48.004 23:52:37 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:48.004 23:52:37 -- event/event.sh@42 -- # return 0 00:05:48.004 00:05:48.004 real 0m16.145s 00:05:48.004 user 0m34.324s 00:05:48.004 sys 0m3.075s 00:05:48.004 23:52:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.004 23:52:37 -- common/autotest_common.sh@10 -- # set +x 00:05:48.004 ************************************ 00:05:48.004 END TEST app_repeat 00:05:48.004 ************************************ 00:05:48.004 23:52:37 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:48.004 23:52:37 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:48.004 23:52:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.004 23:52:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.004 23:52:37 -- common/autotest_common.sh@10 -- # set +x 00:05:48.004 ************************************ 00:05:48.004 START TEST cpu_locks 00:05:48.004 ************************************ 00:05:48.004 23:52:37 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:48.004 * Looking for test storage... 00:05:48.004 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:48.004 23:52:37 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:05:48.004 23:52:37 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:05:48.004 23:52:37 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:05:48.004 23:52:37 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:05:48.004 23:52:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.004 23:52:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.004 23:52:37 -- common/autotest_common.sh@10 -- # set +x 00:05:48.004 ************************************ 00:05:48.004 START TEST default_locks 00:05:48.004 ************************************ 00:05:48.004 23:52:37 -- common/autotest_common.sh@1104 -- # default_locks 00:05:48.004 23:52:37 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=455611 00:05:48.004 23:52:37 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:48.004 23:52:37 -- event/cpu_locks.sh@47 -- # waitforlisten 455611 00:05:48.004 23:52:37 -- common/autotest_common.sh@819 -- # '[' -z 455611 ']' 00:05:48.004 23:52:37 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.004 23:52:37 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:48.004 23:52:37 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.004 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.004 23:52:37 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:48.004 23:52:37 -- common/autotest_common.sh@10 -- # set +x 00:05:48.004 [2024-04-25 23:52:37.491868] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:05:48.004 [2024-04-25 23:52:37.491947] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid455611 ] 00:05:48.004 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.004 [2024-04-25 23:52:37.558821] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.004 [2024-04-25 23:52:37.595441] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:48.004 [2024-04-25 23:52:37.595562] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.941 23:52:38 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:48.941 23:52:38 -- common/autotest_common.sh@852 -- # return 0 00:05:48.941 23:52:38 -- event/cpu_locks.sh@49 -- # locks_exist 455611 00:05:48.941 23:52:38 -- event/cpu_locks.sh@22 -- # lslocks -p 455611 00:05:48.941 23:52:38 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:49.507 lslocks: write error 00:05:49.507 23:52:38 -- event/cpu_locks.sh@50 -- # killprocess 455611 00:05:49.507 23:52:38 -- common/autotest_common.sh@926 -- # '[' -z 455611 ']' 00:05:49.507 23:52:38 -- common/autotest_common.sh@930 -- # kill -0 455611 00:05:49.507 23:52:38 -- common/autotest_common.sh@931 -- # uname 00:05:49.507 23:52:38 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:49.507 23:52:38 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 455611 00:05:49.507 23:52:39 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:49.507 23:52:39 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:49.507 23:52:39 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 455611' 00:05:49.507 killing process with pid 455611 00:05:49.507 23:52:39 -- common/autotest_common.sh@945 -- # kill 455611 00:05:49.507 23:52:39 -- common/autotest_common.sh@950 -- # wait 455611 00:05:49.770 23:52:39 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 455611 00:05:49.770 23:52:39 -- common/autotest_common.sh@640 -- # local es=0 00:05:49.770 23:52:39 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 455611 00:05:49.770 23:52:39 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:49.770 23:52:39 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:49.770 23:52:39 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:49.770 23:52:39 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:49.770 23:52:39 -- common/autotest_common.sh@643 -- # waitforlisten 455611 00:05:49.770 23:52:39 -- common/autotest_common.sh@819 -- # '[' -z 455611 ']' 00:05:49.770 23:52:39 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:49.770 23:52:39 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:49.770 23:52:39 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:49.770 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
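[editor's note] The locks_exist check above is just an advisory-lock probe: a target that claimed core N holds a lock on /var/tmp/spdk_cpu_lock_NNN, which lslocks can list. The stray "lslocks: write error" is most likely lslocks hitting a closed pipe once grep -q exits on its first match, not a test failure. A sketch of the check as it appears in the trace:

    locks_exist_sketch() {
        local pid=$1
        # grep -q exits on the first hit, which can SIGPIPE lslocks ("write error")
        lslocks -p "$pid" | grep -q spdk_cpu_lock
    }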
00:05:49.770 23:52:39 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:49.770 23:52:39 -- common/autotest_common.sh@10 -- # set +x 00:05:49.770 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (455611) - No such process 00:05:49.770 ERROR: process (pid: 455611) is no longer running 00:05:49.770 23:52:39 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:49.770 23:52:39 -- common/autotest_common.sh@852 -- # return 1 00:05:49.770 23:52:39 -- common/autotest_common.sh@643 -- # es=1 00:05:49.770 23:52:39 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:49.770 23:52:39 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:49.770 23:52:39 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:49.770 23:52:39 -- event/cpu_locks.sh@54 -- # no_locks 00:05:49.770 23:52:39 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:49.770 23:52:39 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:49.770 23:52:39 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:49.770 00:05:49.770 real 0m1.851s 00:05:49.770 user 0m1.944s 00:05:49.770 sys 0m0.716s 00:05:49.770 23:52:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.770 23:52:39 -- common/autotest_common.sh@10 -- # set +x 00:05:49.770 ************************************ 00:05:49.770 END TEST default_locks 00:05:49.770 ************************************ 00:05:49.770 23:52:39 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:05:49.770 23:52:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:49.770 23:52:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:49.770 23:52:39 -- common/autotest_common.sh@10 -- # set +x 00:05:49.770 ************************************ 00:05:49.770 START TEST default_locks_via_rpc 00:05:49.770 ************************************ 00:05:49.770 23:52:39 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:05:49.770 23:52:39 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=455927 00:05:49.770 23:52:39 -- event/cpu_locks.sh@63 -- # waitforlisten 455927 00:05:49.770 23:52:39 -- common/autotest_common.sh@819 -- # '[' -z 455927 ']' 00:05:49.770 23:52:39 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:49.770 23:52:39 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:49.770 23:52:39 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:49.770 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:49.770 23:52:39 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:49.770 23:52:39 -- common/autotest_common.sh@10 -- # set +x 00:05:49.770 23:52:39 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:50.140 [2024-04-25 23:52:39.388378] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
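[editor's note] After killprocess, the harness asserts the negative path: NOT waitforlisten 455611 must now fail, which is where the "kill: (455611) - No such process" line and the es=1 accounting above come from. A hedged sketch of that negation wrapper; the real NOT in autotest_common.sh also validates how the wrapped command failed:

    NOT_sketch() {
        if "$@"; then
            return 1   # the wrapped command unexpectedly succeeded
        fi
        return 0       # expected failure, es=1 in the log's accounting
    }
    # usage, after the target was killed:
    # NOT_sketch waitforlisten_sketch 455611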
00:05:50.140 [2024-04-25 23:52:39.388483] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid455927 ] 00:05:50.140 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.140 [2024-04-25 23:52:39.458078] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.140 [2024-04-25 23:52:39.494829] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:50.140 [2024-04-25 23:52:39.494944] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.744 23:52:40 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:50.744 23:52:40 -- common/autotest_common.sh@852 -- # return 0 00:05:50.744 23:52:40 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:05:50.744 23:52:40 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.744 23:52:40 -- common/autotest_common.sh@10 -- # set +x 00:05:50.744 23:52:40 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.744 23:52:40 -- event/cpu_locks.sh@67 -- # no_locks 00:05:50.744 23:52:40 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:50.744 23:52:40 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:50.744 23:52:40 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:50.745 23:52:40 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:05:50.745 23:52:40 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.745 23:52:40 -- common/autotest_common.sh@10 -- # set +x 00:05:50.745 23:52:40 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.745 23:52:40 -- event/cpu_locks.sh@71 -- # locks_exist 455927 00:05:50.745 23:52:40 -- event/cpu_locks.sh@22 -- # lslocks -p 455927 00:05:50.745 23:52:40 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:51.015 23:52:40 -- event/cpu_locks.sh@73 -- # killprocess 455927 00:05:51.015 23:52:40 -- common/autotest_common.sh@926 -- # '[' -z 455927 ']' 00:05:51.015 23:52:40 -- common/autotest_common.sh@930 -- # kill -0 455927 00:05:51.015 23:52:40 -- common/autotest_common.sh@931 -- # uname 00:05:51.015 23:52:40 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:51.015 23:52:40 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 455927 00:05:51.288 23:52:40 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:51.288 23:52:40 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:51.288 23:52:40 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 455927' 00:05:51.288 killing process with pid 455927 00:05:51.288 23:52:40 -- common/autotest_common.sh@945 -- # kill 455927 00:05:51.288 23:52:40 -- common/autotest_common.sh@950 -- # wait 455927 00:05:51.546 00:05:51.546 real 0m1.564s 00:05:51.546 user 0m1.617s 00:05:51.546 sys 0m0.529s 00:05:51.546 23:52:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:51.546 23:52:40 -- common/autotest_common.sh@10 -- # set +x 00:05:51.546 ************************************ 00:05:51.546 END TEST default_locks_via_rpc 00:05:51.546 ************************************ 00:05:51.546 23:52:40 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:05:51.546 23:52:40 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:51.546 23:52:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:51.546 23:52:40 -- common/autotest_common.sh@10 
-- # set +x 00:05:51.546 ************************************ 00:05:51.546 START TEST non_locking_app_on_locked_coremask 00:05:51.546 ************************************ 00:05:51.546 23:52:40 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:05:51.546 23:52:40 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=456245 00:05:51.546 23:52:40 -- event/cpu_locks.sh@81 -- # waitforlisten 456245 /var/tmp/spdk.sock 00:05:51.546 23:52:40 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:51.546 23:52:40 -- common/autotest_common.sh@819 -- # '[' -z 456245 ']' 00:05:51.546 23:52:40 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:51.546 23:52:40 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:51.546 23:52:40 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:51.546 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:51.546 23:52:40 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:51.546 23:52:40 -- common/autotest_common.sh@10 -- # set +x 00:05:51.546 [2024-04-25 23:52:41.003305] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:51.546 [2024-04-25 23:52:41.003409] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid456245 ] 00:05:51.546 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.546 [2024-04-25 23:52:41.072817] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:51.546 [2024-04-25 23:52:41.108337] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:51.546 [2024-04-25 23:52:41.108465] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.481 23:52:41 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:52.481 23:52:41 -- common/autotest_common.sh@852 -- # return 0 00:05:52.481 23:52:41 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:05:52.481 23:52:41 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=456493 00:05:52.481 23:52:41 -- event/cpu_locks.sh@85 -- # waitforlisten 456493 /var/tmp/spdk2.sock 00:05:52.481 23:52:41 -- common/autotest_common.sh@819 -- # '[' -z 456493 ']' 00:05:52.481 23:52:41 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:52.481 23:52:41 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:52.481 23:52:41 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:52.481 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:52.481 23:52:41 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:52.481 23:52:41 -- common/autotest_common.sh@10 -- # set +x 00:05:52.481 [2024-04-25 23:52:41.834682] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
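[editor's note] In the default_locks_via_rpc run above, the same lock files are toggled at runtime over the RPC socket rather than via process flags: framework_disable_cpumask_locks releases the per-core locks and framework_enable_cpumask_locks re-claims them, with no_locks/locks_exist verifying each state. The method names are verbatim from the log; invoking them through scripts/rpc.py is an assumption about how rpc_cmd resolves:

    scripts/rpc.py -s /var/tmp/spdk.sock framework_disable_cpumask_locks
    lslocks -p "$pid" | grep -q spdk_cpu_lock && echo "unexpected: locks still held"
    scripts/rpc.py -s /var/tmp/spdk.sock framework_enable_cpumask_locks
    lslocks -p "$pid" | grep -q spdk_cpu_lock || echo "unexpected: locks missing"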
00:05:52.481 [2024-04-25 23:52:41.834765] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid456493 ] 00:05:52.481 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.481 [2024-04-25 23:52:41.923940] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:52.481 [2024-04-25 23:52:41.923967] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.481 [2024-04-25 23:52:41.996017] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:52.481 [2024-04-25 23:52:41.996151] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.048 23:52:42 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:53.048 23:52:42 -- common/autotest_common.sh@852 -- # return 0 00:05:53.048 23:52:42 -- event/cpu_locks.sh@87 -- # locks_exist 456245 00:05:53.048 23:52:42 -- event/cpu_locks.sh@22 -- # lslocks -p 456245 00:05:53.048 23:52:42 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:54.420 lslocks: write error 00:05:54.420 23:52:43 -- event/cpu_locks.sh@89 -- # killprocess 456245 00:05:54.420 23:52:43 -- common/autotest_common.sh@926 -- # '[' -z 456245 ']' 00:05:54.420 23:52:43 -- common/autotest_common.sh@930 -- # kill -0 456245 00:05:54.420 23:52:43 -- common/autotest_common.sh@931 -- # uname 00:05:54.420 23:52:43 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:54.420 23:52:43 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 456245 00:05:54.420 23:52:43 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:54.420 23:52:43 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:54.420 23:52:43 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 456245' 00:05:54.420 killing process with pid 456245 00:05:54.420 23:52:43 -- common/autotest_common.sh@945 -- # kill 456245 00:05:54.420 23:52:43 -- common/autotest_common.sh@950 -- # wait 456245 00:05:54.984 23:52:44 -- event/cpu_locks.sh@90 -- # killprocess 456493 00:05:54.984 23:52:44 -- common/autotest_common.sh@926 -- # '[' -z 456493 ']' 00:05:54.984 23:52:44 -- common/autotest_common.sh@930 -- # kill -0 456493 00:05:54.984 23:52:44 -- common/autotest_common.sh@931 -- # uname 00:05:54.984 23:52:44 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:54.984 23:52:44 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 456493 00:05:54.984 23:52:44 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:54.984 23:52:44 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:54.984 23:52:44 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 456493' 00:05:54.984 killing process with pid 456493 00:05:54.984 23:52:44 -- common/autotest_common.sh@945 -- # kill 456493 00:05:54.984 23:52:44 -- common/autotest_common.sh@950 -- # wait 456493 00:05:55.243 00:05:55.243 real 0m3.796s 00:05:55.243 user 0m4.035s 00:05:55.243 sys 0m1.230s 00:05:55.243 23:52:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:55.243 23:52:44 -- common/autotest_common.sh@10 -- # set +x 00:05:55.243 ************************************ 00:05:55.243 END TEST non_locking_app_on_locked_coremask 00:05:55.243 ************************************ 00:05:55.243 23:52:44 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 
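[editor's note] The non_locking_app_on_locked_coremask test that just finished exercises the escape hatch: a second target on the same core mask boots only because it passes --disable-cpumask-locks (hence the "CPU core locks deactivated" notice) and talks on its own socket. A sketch of that launch sequence, with flags and socket paths as logged and binary paths abbreviated:

    build/bin/spdk_tgt -m 0x1 &                      # first target claims core 0
    pid1=$!
    waitforlisten_sketch "$pid1" /var/tmp/spdk.sock
    build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &
    pid2=$!                                          # boots despite core 0 being claimed
    waitforlisten_sketch "$pid2" /var/tmp/spdk2.sock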
00:05:55.243 23:52:44 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:55.243 23:52:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:55.243 23:52:44 -- common/autotest_common.sh@10 -- # set +x 00:05:55.243 ************************************ 00:05:55.243 START TEST locking_app_on_unlocked_coremask 00:05:55.243 ************************************ 00:05:55.243 23:52:44 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:05:55.243 23:52:44 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=457067 00:05:55.243 23:52:44 -- event/cpu_locks.sh@99 -- # waitforlisten 457067 /var/tmp/spdk.sock 00:05:55.243 23:52:44 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:05:55.243 23:52:44 -- common/autotest_common.sh@819 -- # '[' -z 457067 ']' 00:05:55.243 23:52:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:55.243 23:52:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:55.243 23:52:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:55.243 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:55.243 23:52:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:55.243 23:52:44 -- common/autotest_common.sh@10 -- # set +x 00:05:55.243 [2024-04-25 23:52:44.846616] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:55.243 [2024-04-25 23:52:44.846703] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid457067 ] 00:05:55.502 EAL: No free 2048 kB hugepages reported on node 1 00:05:55.502 [2024-04-25 23:52:44.915207] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:55.502 [2024-04-25 23:52:44.915238] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.502 [2024-04-25 23:52:44.948464] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:55.502 [2024-04-25 23:52:44.948580] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.069 23:52:45 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:56.069 23:52:45 -- common/autotest_common.sh@852 -- # return 0 00:05:56.069 23:52:45 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:56.069 23:52:45 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=457131 00:05:56.069 23:52:45 -- event/cpu_locks.sh@103 -- # waitforlisten 457131 /var/tmp/spdk2.sock 00:05:56.069 23:52:45 -- common/autotest_common.sh@819 -- # '[' -z 457131 ']' 00:05:56.069 23:52:45 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:56.069 23:52:45 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:56.069 23:52:45 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:56.069 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:05:56.069 23:52:45 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:56.069 23:52:45 -- common/autotest_common.sh@10 -- # set +x 00:05:56.069 [2024-04-25 23:52:45.676588] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:56.069 [2024-04-25 23:52:45.676662] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid457131 ] 00:05:56.328 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.328 [2024-04-25 23:52:45.771420] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:56.328 [2024-04-25 23:52:45.843349] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:56.328 [2024-04-25 23:52:45.843492] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.262 23:52:46 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:57.262 23:52:46 -- common/autotest_common.sh@852 -- # return 0 00:05:57.262 23:52:46 -- event/cpu_locks.sh@105 -- # locks_exist 457131 00:05:57.262 23:52:46 -- event/cpu_locks.sh@22 -- # lslocks -p 457131 00:05:57.262 23:52:46 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:58.195 lslocks: write error 00:05:58.195 23:52:47 -- event/cpu_locks.sh@107 -- # killprocess 457067 00:05:58.195 23:52:47 -- common/autotest_common.sh@926 -- # '[' -z 457067 ']' 00:05:58.195 23:52:47 -- common/autotest_common.sh@930 -- # kill -0 457067 00:05:58.195 23:52:47 -- common/autotest_common.sh@931 -- # uname 00:05:58.195 23:52:47 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:58.195 23:52:47 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 457067 00:05:58.195 23:52:47 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:58.195 23:52:47 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:58.195 23:52:47 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 457067' 00:05:58.195 killing process with pid 457067 00:05:58.195 23:52:47 -- common/autotest_common.sh@945 -- # kill 457067 00:05:58.195 23:52:47 -- common/autotest_common.sh@950 -- # wait 457067 00:05:58.761 23:52:48 -- event/cpu_locks.sh@108 -- # killprocess 457131 00:05:58.761 23:52:48 -- common/autotest_common.sh@926 -- # '[' -z 457131 ']' 00:05:58.761 23:52:48 -- common/autotest_common.sh@930 -- # kill -0 457131 00:05:58.761 23:52:48 -- common/autotest_common.sh@931 -- # uname 00:05:58.761 23:52:48 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:58.761 23:52:48 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 457131 00:05:58.761 23:52:48 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:58.761 23:52:48 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:58.761 23:52:48 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 457131' 00:05:58.761 killing process with pid 457131 00:05:58.761 23:52:48 -- common/autotest_common.sh@945 -- # kill 457131 00:05:58.761 23:52:48 -- common/autotest_common.sh@950 -- # wait 457131 00:05:59.019 00:05:59.019 real 0m3.673s 00:05:59.019 user 0m3.885s 00:05:59.019 sys 0m1.230s 00:05:59.019 23:52:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:59.019 23:52:48 -- common/autotest_common.sh@10 -- # set +x 00:05:59.019 ************************************ 00:05:59.019 END TEST locking_app_on_unlocked_coremask 00:05:59.019 
************************************ 00:05:59.019 23:52:48 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:05:59.019 23:52:48 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:59.019 23:52:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:59.019 23:52:48 -- common/autotest_common.sh@10 -- # set +x 00:05:59.019 ************************************ 00:05:59.019 START TEST locking_app_on_locked_coremask 00:05:59.019 ************************************ 00:05:59.019 23:52:48 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:05:59.019 23:52:48 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=457656 00:05:59.019 23:52:48 -- event/cpu_locks.sh@116 -- # waitforlisten 457656 /var/tmp/spdk.sock 00:05:59.019 23:52:48 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:59.019 23:52:48 -- common/autotest_common.sh@819 -- # '[' -z 457656 ']' 00:05:59.019 23:52:48 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:59.019 23:52:48 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:59.019 23:52:48 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:59.019 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:59.019 23:52:48 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:59.019 23:52:48 -- common/autotest_common.sh@10 -- # set +x 00:05:59.019 [2024-04-25 23:52:48.570216] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:59.019 [2024-04-25 23:52:48.570308] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid457656 ] 00:05:59.019 EAL: No free 2048 kB hugepages reported on node 1 00:05:59.276 [2024-04-25 23:52:48.640279] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:59.276 [2024-04-25 23:52:48.674969] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:59.276 [2024-04-25 23:52:48.675104] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.842 23:52:49 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:59.842 23:52:49 -- common/autotest_common.sh@852 -- # return 0 00:05:59.842 23:52:49 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=457924 00:05:59.842 23:52:49 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 457924 /var/tmp/spdk2.sock 00:05:59.842 23:52:49 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:59.842 23:52:49 -- common/autotest_common.sh@640 -- # local es=0 00:05:59.842 23:52:49 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 457924 /var/tmp/spdk2.sock 00:05:59.842 23:52:49 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:59.842 23:52:49 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:59.842 23:52:49 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:59.842 23:52:49 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:59.842 23:52:49 -- common/autotest_common.sh@643 -- # waitforlisten 457924 /var/tmp/spdk2.sock 00:05:59.842 23:52:49 -- common/autotest_common.sh@819 -- # '[' -z 457924 
']' 00:05:59.842 23:52:49 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:59.842 23:52:49 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:59.842 23:52:49 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:59.842 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:59.842 23:52:49 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:59.842 23:52:49 -- common/autotest_common.sh@10 -- # set +x 00:05:59.842 [2024-04-25 23:52:49.408818] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:05:59.842 [2024-04-25 23:52:49.408909] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid457924 ] 00:05:59.842 EAL: No free 2048 kB hugepages reported on node 1 00:06:00.100 [2024-04-25 23:52:49.499338] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 457656 has claimed it. 00:06:00.100 [2024-04-25 23:52:49.499375] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:00.666 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (457924) - No such process 00:06:00.666 ERROR: process (pid: 457924) is no longer running 00:06:00.666 23:52:50 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:00.666 23:52:50 -- common/autotest_common.sh@852 -- # return 1 00:06:00.666 23:52:50 -- common/autotest_common.sh@643 -- # es=1 00:06:00.666 23:52:50 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:00.666 23:52:50 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:00.666 23:52:50 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:00.666 23:52:50 -- event/cpu_locks.sh@122 -- # locks_exist 457656 00:06:00.666 23:52:50 -- event/cpu_locks.sh@22 -- # lslocks -p 457656 00:06:00.666 23:52:50 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:00.925 lslocks: write error 00:06:00.925 23:52:50 -- event/cpu_locks.sh@124 -- # killprocess 457656 00:06:00.925 23:52:50 -- common/autotest_common.sh@926 -- # '[' -z 457656 ']' 00:06:00.925 23:52:50 -- common/autotest_common.sh@930 -- # kill -0 457656 00:06:00.925 23:52:50 -- common/autotest_common.sh@931 -- # uname 00:06:00.925 23:52:50 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:00.925 23:52:50 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 457656 00:06:00.925 23:52:50 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:00.925 23:52:50 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:00.925 23:52:50 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 457656' 00:06:00.925 killing process with pid 457656 00:06:00.925 23:52:50 -- common/autotest_common.sh@945 -- # kill 457656 00:06:00.925 23:52:50 -- common/autotest_common.sh@950 -- # wait 457656 00:06:01.183 00:06:01.183 real 0m2.221s 00:06:01.183 user 0m2.399s 00:06:01.183 sys 0m0.666s 00:06:01.183 23:52:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.183 23:52:50 -- common/autotest_common.sh@10 -- # set +x 00:06:01.183 ************************************ 00:06:01.183 END TEST locking_app_on_locked_coremask 00:06:01.183 ************************************ 00:06:01.442 23:52:50 -- 
event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:01.442 23:52:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:01.442 23:52:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:01.442 23:52:50 -- common/autotest_common.sh@10 -- # set +x 00:06:01.442 ************************************ 00:06:01.442 START TEST locking_overlapped_coremask 00:06:01.442 ************************************ 00:06:01.442 23:52:50 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:06:01.442 23:52:50 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=458218 00:06:01.442 23:52:50 -- event/cpu_locks.sh@133 -- # waitforlisten 458218 /var/tmp/spdk.sock 00:06:01.442 23:52:50 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:01.442 23:52:50 -- common/autotest_common.sh@819 -- # '[' -z 458218 ']' 00:06:01.442 23:52:50 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:01.442 23:52:50 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:01.442 23:52:50 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:01.442 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:01.442 23:52:50 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:01.442 23:52:50 -- common/autotest_common.sh@10 -- # set +x 00:06:01.442 [2024-04-25 23:52:50.839521] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:01.442 [2024-04-25 23:52:50.839595] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid458218 ] 00:06:01.442 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.443 [2024-04-25 23:52:50.907600] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:01.443 [2024-04-25 23:52:50.941049] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:01.443 [2024-04-25 23:52:50.941241] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:01.443 [2024-04-25 23:52:50.941344] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:01.443 [2024-04-25 23:52:50.941346] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:02.380 23:52:51 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:02.380 23:52:51 -- common/autotest_common.sh@852 -- # return 0 00:06:02.380 23:52:51 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=458241 00:06:02.380 23:52:51 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 458241 /var/tmp/spdk2.sock 00:06:02.380 23:52:51 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:02.380 23:52:51 -- common/autotest_common.sh@640 -- # local es=0 00:06:02.380 23:52:51 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 458241 /var/tmp/spdk2.sock 00:06:02.380 23:52:51 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:02.380 23:52:51 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:02.380 23:52:51 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:02.380 23:52:51 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:02.380 23:52:51 -- 
common/autotest_common.sh@643 -- # waitforlisten 458241 /var/tmp/spdk2.sock 00:06:02.380 23:52:51 -- common/autotest_common.sh@819 -- # '[' -z 458241 ']' 00:06:02.380 23:52:51 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:02.380 23:52:51 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:02.380 23:52:51 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:02.380 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:02.380 23:52:51 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:02.380 23:52:51 -- common/autotest_common.sh@10 -- # set +x 00:06:02.380 [2024-04-25 23:52:51.667369] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:02.380 [2024-04-25 23:52:51.667434] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid458241 ] 00:06:02.380 EAL: No free 2048 kB hugepages reported on node 1 00:06:02.380 [2024-04-25 23:52:51.761337] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 458218 has claimed it. 00:06:02.380 [2024-04-25 23:52:51.761377] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:02.949 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (458241) - No such process 00:06:02.949 ERROR: process (pid: 458241) is no longer running 00:06:02.949 23:52:52 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:02.949 23:52:52 -- common/autotest_common.sh@852 -- # return 1 00:06:02.949 23:52:52 -- common/autotest_common.sh@643 -- # es=1 00:06:02.949 23:52:52 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:02.949 23:52:52 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:02.949 23:52:52 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:02.949 23:52:52 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:02.949 23:52:52 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:02.949 23:52:52 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:02.949 23:52:52 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:02.949 23:52:52 -- event/cpu_locks.sh@141 -- # killprocess 458218 00:06:02.949 23:52:52 -- common/autotest_common.sh@926 -- # '[' -z 458218 ']' 00:06:02.949 23:52:52 -- common/autotest_common.sh@930 -- # kill -0 458218 00:06:02.949 23:52:52 -- common/autotest_common.sh@931 -- # uname 00:06:02.949 23:52:52 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:02.949 23:52:52 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 458218 00:06:02.949 23:52:52 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:02.949 23:52:52 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:02.949 23:52:52 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 458218' 00:06:02.949 killing process with pid 458218 00:06:02.949 23:52:52 -- common/autotest_common.sh@945 -- # kill 458218 00:06:02.949 23:52:52 -- 
common/autotest_common.sh@950 -- # wait 458218 00:06:03.209 00:06:03.209 real 0m1.833s 00:06:03.209 user 0m5.237s 00:06:03.209 sys 0m0.431s 00:06:03.209 23:52:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:03.209 23:52:52 -- common/autotest_common.sh@10 -- # set +x 00:06:03.209 ************************************ 00:06:03.209 END TEST locking_overlapped_coremask 00:06:03.209 ************************************ 00:06:03.209 23:52:52 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:03.209 23:52:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:03.209 23:52:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:03.209 23:52:52 -- common/autotest_common.sh@10 -- # set +x 00:06:03.209 ************************************ 00:06:03.209 START TEST locking_overlapped_coremask_via_rpc 00:06:03.209 ************************************ 00:06:03.209 23:52:52 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:06:03.209 23:52:52 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=458533 00:06:03.209 23:52:52 -- event/cpu_locks.sh@149 -- # waitforlisten 458533 /var/tmp/spdk.sock 00:06:03.209 23:52:52 -- common/autotest_common.sh@819 -- # '[' -z 458533 ']' 00:06:03.209 23:52:52 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:03.209 23:52:52 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:03.209 23:52:52 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:03.209 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:03.209 23:52:52 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:03.209 23:52:52 -- common/autotest_common.sh@10 -- # set +x 00:06:03.209 23:52:52 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:03.209 [2024-04-25 23:52:52.714996] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:03.209 [2024-04-25 23:52:52.715075] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid458533 ] 00:06:03.209 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.209 [2024-04-25 23:52:52.784048] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
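[editor's note] The check_remaining_locks step at the end of locking_overlapped_coremask (the glob comparison traced above) pins down exactly which lock files survive: for -m 0x7 the remaining process must hold locks for cores 0-2 and nothing else. Reconstructed from the arrays shown in the log:

    locks=(/var/tmp/spdk_cpu_lock_*)
    locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})   # cores 0-2 for mask 0x7
    [[ "${locks[*]}" == "${locks_expected[*]}" ]] || echo "unexpected locks: ${locks[*]}"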
00:06:03.209 [2024-04-25 23:52:52.784071] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:03.468 [2024-04-25 23:52:52.822623] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:03.468 [2024-04-25 23:52:52.822756] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:03.468 [2024-04-25 23:52:52.822850] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:03.468 [2024-04-25 23:52:52.822852] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.036 23:52:53 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:04.036 23:52:53 -- common/autotest_common.sh@852 -- # return 0 00:06:04.036 23:52:53 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=458657 00:06:04.036 23:52:53 -- event/cpu_locks.sh@153 -- # waitforlisten 458657 /var/tmp/spdk2.sock 00:06:04.036 23:52:53 -- common/autotest_common.sh@819 -- # '[' -z 458657 ']' 00:06:04.036 23:52:53 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:04.036 23:52:53 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:04.036 23:52:53 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:04.036 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:04.036 23:52:53 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:04.036 23:52:53 -- common/autotest_common.sh@10 -- # set +x 00:06:04.036 23:52:53 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:04.036 [2024-04-25 23:52:53.548486] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:04.036 [2024-04-25 23:52:53.548569] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid458657 ] 00:06:04.036 EAL: No free 2048 kB hugepages reported on node 1 00:06:04.036 [2024-04-25 23:52:53.643185] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:04.036 [2024-04-25 23:52:53.643213] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:04.294 [2024-04-25 23:52:53.717143] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:04.294 [2024-04-25 23:52:53.717307] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:04.294 [2024-04-25 23:52:53.720464] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:04.294 [2024-04-25 23:52:53.720465] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:04.862 23:52:54 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:04.863 23:52:54 -- common/autotest_common.sh@852 -- # return 0 00:06:04.863 23:52:54 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:04.863 23:52:54 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:04.863 23:52:54 -- common/autotest_common.sh@10 -- # set +x 00:06:04.863 23:52:54 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:04.863 23:52:54 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:04.863 23:52:54 -- common/autotest_common.sh@640 -- # local es=0 00:06:04.863 23:52:54 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:04.863 23:52:54 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:06:04.863 23:52:54 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:04.863 23:52:54 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:06:04.863 23:52:54 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:04.863 23:52:54 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:04.863 23:52:54 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:04.863 23:52:54 -- common/autotest_common.sh@10 -- # set +x 00:06:04.863 [2024-04-25 23:52:54.385451] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 458533 has claimed it. 00:06:04.863 request: 00:06:04.863 { 00:06:04.863 "method": "framework_enable_cpumask_locks", 00:06:04.863 "req_id": 1 00:06:04.863 } 00:06:04.863 Got JSON-RPC error response 00:06:04.863 response: 00:06:04.863 { 00:06:04.863 "code": -32603, 00:06:04.863 "message": "Failed to claim CPU core: 2" 00:06:04.863 } 00:06:04.863 23:52:54 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:06:04.863 23:52:54 -- common/autotest_common.sh@643 -- # es=1 00:06:04.863 23:52:54 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:04.863 23:52:54 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:04.863 23:52:54 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:04.863 23:52:54 -- event/cpu_locks.sh@158 -- # waitforlisten 458533 /var/tmp/spdk.sock 00:06:04.863 23:52:54 -- common/autotest_common.sh@819 -- # '[' -z 458533 ']' 00:06:04.863 23:52:54 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:04.863 23:52:54 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:04.863 23:52:54 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:04.863 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
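[editor's note] The exchange above is the core of the via_rpc test: 458533 (mask 0x7) already holds core 2, and the second target (mask 0x1c, started with --disable-cpumask-locks) overlaps it on exactly that core, so enabling its locks has to fail. Reproducing the call by hand would look like this, with the logged error as the expected outcome:

    scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks
    # expected JSON-RPC error response, as captured above:
    #   {"code": -32603, "message": "Failed to claim CPU core: 2"}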
00:06:04.863 23:52:54 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:04.863 23:52:54 -- common/autotest_common.sh@10 -- # set +x 00:06:05.121 23:52:54 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:05.121 23:52:54 -- common/autotest_common.sh@852 -- # return 0 00:06:05.121 23:52:54 -- event/cpu_locks.sh@159 -- # waitforlisten 458657 /var/tmp/spdk2.sock 00:06:05.121 23:52:54 -- common/autotest_common.sh@819 -- # '[' -z 458657 ']' 00:06:05.121 23:52:54 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:05.121 23:52:54 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:05.122 23:52:54 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:05.122 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:05.122 23:52:54 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:05.122 23:52:54 -- common/autotest_common.sh@10 -- # set +x 00:06:05.381 23:52:54 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:05.381 23:52:54 -- common/autotest_common.sh@852 -- # return 0 00:06:05.381 23:52:54 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:05.381 23:52:54 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:05.381 23:52:54 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:05.381 23:52:54 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:05.381 00:06:05.381 real 0m2.060s 00:06:05.381 user 0m0.778s 00:06:05.381 sys 0m0.213s 00:06:05.381 23:52:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:05.381 23:52:54 -- common/autotest_common.sh@10 -- # set +x 00:06:05.381 ************************************ 00:06:05.381 END TEST locking_overlapped_coremask_via_rpc 00:06:05.381 ************************************ 00:06:05.381 23:52:54 -- event/cpu_locks.sh@174 -- # cleanup 00:06:05.381 23:52:54 -- event/cpu_locks.sh@15 -- # [[ -z 458533 ]] 00:06:05.381 23:52:54 -- event/cpu_locks.sh@15 -- # killprocess 458533 00:06:05.381 23:52:54 -- common/autotest_common.sh@926 -- # '[' -z 458533 ']' 00:06:05.381 23:52:54 -- common/autotest_common.sh@930 -- # kill -0 458533 00:06:05.381 23:52:54 -- common/autotest_common.sh@931 -- # uname 00:06:05.381 23:52:54 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:05.381 23:52:54 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 458533 00:06:05.381 23:52:54 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:05.381 23:52:54 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:05.381 23:52:54 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 458533' 00:06:05.381 killing process with pid 458533 00:06:05.381 23:52:54 -- common/autotest_common.sh@945 -- # kill 458533 00:06:05.381 23:52:54 -- common/autotest_common.sh@950 -- # wait 458533 00:06:05.640 23:52:55 -- event/cpu_locks.sh@16 -- # [[ -z 458657 ]] 00:06:05.640 23:52:55 -- event/cpu_locks.sh@16 -- # killprocess 458657 00:06:05.640 23:52:55 -- common/autotest_common.sh@926 -- # '[' -z 458657 ']' 00:06:05.640 23:52:55 -- common/autotest_common.sh@930 -- # kill -0 458657 00:06:05.640 23:52:55 -- common/autotest_common.sh@931 -- # uname 00:06:05.640 
23:52:55 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:05.640 23:52:55 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 458657 00:06:05.640 23:52:55 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:06:05.640 23:52:55 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:06:05.640 23:52:55 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 458657' 00:06:05.640 killing process with pid 458657 00:06:05.640 23:52:55 -- common/autotest_common.sh@945 -- # kill 458657 00:06:05.640 23:52:55 -- common/autotest_common.sh@950 -- # wait 458657 00:06:05.899 23:52:55 -- event/cpu_locks.sh@18 -- # rm -f 00:06:05.899 23:52:55 -- event/cpu_locks.sh@1 -- # cleanup 00:06:05.899 23:52:55 -- event/cpu_locks.sh@15 -- # [[ -z 458533 ]] 00:06:05.899 23:52:55 -- event/cpu_locks.sh@15 -- # killprocess 458533 00:06:05.899 23:52:55 -- common/autotest_common.sh@926 -- # '[' -z 458533 ']' 00:06:05.899 23:52:55 -- common/autotest_common.sh@930 -- # kill -0 458533 00:06:05.899 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (458533) - No such process 00:06:05.899 23:52:55 -- common/autotest_common.sh@953 -- # echo 'Process with pid 458533 is not found' 00:06:05.899 Process with pid 458533 is not found 00:06:05.899 23:52:55 -- event/cpu_locks.sh@16 -- # [[ -z 458657 ]] 00:06:05.899 23:52:55 -- event/cpu_locks.sh@16 -- # killprocess 458657 00:06:05.899 23:52:55 -- common/autotest_common.sh@926 -- # '[' -z 458657 ']' 00:06:05.899 23:52:55 -- common/autotest_common.sh@930 -- # kill -0 458657 00:06:05.899 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (458657) - No such process 00:06:05.899 23:52:55 -- common/autotest_common.sh@953 -- # echo 'Process with pid 458657 is not found' 00:06:05.899 Process with pid 458657 is not found 00:06:05.899 23:52:55 -- event/cpu_locks.sh@18 -- # rm -f 00:06:05.899 00:06:05.899 real 0m18.151s 00:06:05.899 user 0m30.422s 00:06:05.899 sys 0m5.933s 00:06:05.899 23:52:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:05.899 23:52:55 -- common/autotest_common.sh@10 -- # set +x 00:06:05.899 ************************************ 00:06:05.899 END TEST cpu_locks 00:06:05.899 ************************************ 00:06:06.158 00:06:06.158 real 0m42.273s 00:06:06.158 user 1m17.702s 00:06:06.158 sys 0m9.937s 00:06:06.158 23:52:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:06.158 23:52:55 -- common/autotest_common.sh@10 -- # set +x 00:06:06.158 ************************************ 00:06:06.158 END TEST event 00:06:06.158 ************************************ 00:06:06.158 23:52:55 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:06.158 23:52:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:06.158 23:52:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:06.158 23:52:55 -- common/autotest_common.sh@10 -- # set +x 00:06:06.158 ************************************ 00:06:06.158 START TEST thread 00:06:06.158 ************************************ 00:06:06.158 23:52:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:06.158 * Looking for test storage... 
00:06:06.158 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:06.158 23:52:55 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:06.158 23:52:55 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:06.159 23:52:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:06.159 23:52:55 -- common/autotest_common.sh@10 -- # set +x 00:06:06.159 ************************************ 00:06:06.159 START TEST thread_poller_perf 00:06:06.159 ************************************ 00:06:06.159 23:52:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:06.159 [2024-04-25 23:52:55.718336] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:06.159 [2024-04-25 23:52:55.718451] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid459168 ] 00:06:06.159 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.417 [2024-04-25 23:52:55.788349] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.417 [2024-04-25 23:52:55.824861] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.417 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:06:07.353 ====================================== 00:06:07.353 busy:2505965406 (cyc) 00:06:07.353 total_run_count: 807000 00:06:07.353 tsc_hz: 2500000000 (cyc) 00:06:07.353 ====================================== 00:06:07.353 poller_cost: 3105 (cyc), 1242 (nsec) 00:06:07.353 00:06:07.353 real 0m1.181s 00:06:07.353 user 0m1.092s 00:06:07.353 sys 0m0.085s 00:06:07.353 23:52:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:07.353 23:52:56 -- common/autotest_common.sh@10 -- # set +x 00:06:07.353 ************************************ 00:06:07.353 END TEST thread_poller_perf 00:06:07.353 ************************************ 00:06:07.353 23:52:56 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:07.353 23:52:56 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:07.353 23:52:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:07.353 23:52:56 -- common/autotest_common.sh@10 -- # set +x 00:06:07.353 ************************************ 00:06:07.353 START TEST thread_poller_perf 00:06:07.353 ************************************ 00:06:07.353 23:52:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:07.353 [2024-04-25 23:52:56.949159] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
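[editor's note] The first poller_perf summary above is worth decoding: poller_cost is simply busy cycles divided by total_run_count, converted to nanoseconds through tsc_hz. With the logged numbers:

    echo $((2505965406 / 807000))              # 3105 cyc per poll
    echo $((3105 * 1000000000 / 2500000000))   # 1242 nsec at tsc_hz = 2.5 GHz

The -l 0 run that follows drops this to 174 cyc (about 69 nsec), presumably because period-0 pollers avoid the timed-poller bookkeeping that a 1 us period forces on every iteration.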
00:06:07.353 [2024-04-25 23:52:56.949247] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid459452 ] 00:06:07.612 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.612 [2024-04-25 23:52:57.021124] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.612 [2024-04-25 23:52:57.056642] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.612 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:08.548 ====================================== 00:06:08.548 busy:2501925834 (cyc) 00:06:08.548 total_run_count: 14369000 00:06:08.548 tsc_hz: 2500000000 (cyc) 00:06:08.548 ====================================== 00:06:08.548 poller_cost: 174 (cyc), 69 (nsec) 00:06:08.548 00:06:08.548 real 0m1.179s 00:06:08.548 user 0m1.088s 00:06:08.548 sys 0m0.088s 00:06:08.548 23:52:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:08.548 23:52:58 -- common/autotest_common.sh@10 -- # set +x 00:06:08.548 ************************************ 00:06:08.548 END TEST thread_poller_perf 00:06:08.548 ************************************ 00:06:08.548 23:52:58 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:08.548 23:52:58 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:08.548 23:52:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:08.548 23:52:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:08.548 23:52:58 -- common/autotest_common.sh@10 -- # set +x 00:06:08.806 ************************************ 00:06:08.806 START TEST thread_spdk_lock 00:06:08.806 ************************************ 00:06:08.806 23:52:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:08.806 [2024-04-25 23:52:58.178613] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:08.806 [2024-04-25 23:52:58.178741] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid459598 ] 00:06:08.806 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.806 [2024-04-25 23:52:58.250194] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:08.806 [2024-04-25 23:52:58.286272] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:08.806 [2024-04-25 23:52:58.286276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.374 [2024-04-25 23:52:58.779130] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:09.374 [2024-04-25 23:52:58.779165] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:09.374 [2024-04-25 23:52:58.779176] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x133de80 00:06:09.374 [2024-04-25 23:52:58.780034] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:09.374 [2024-04-25 23:52:58.780137] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:09.375 [2024-04-25 23:52:58.780155] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:09.375 Starting test contend 00:06:09.375 Worker Delay Wait us Hold us Total us 00:06:09.375 0 3 169715 188839 358555 00:06:09.375 1 5 83423 290641 374065 00:06:09.375 PASS test contend 00:06:09.375 Starting test hold_by_poller 00:06:09.375 PASS test hold_by_poller 00:06:09.375 Starting test hold_by_message 00:06:09.375 PASS test hold_by_message 00:06:09.375 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:09.375 100014 assertions passed 00:06:09.375 0 assertions failed 00:06:09.375 00:06:09.375 real 0m0.670s 00:06:09.375 user 0m1.074s 00:06:09.375 sys 0m0.087s 00:06:09.375 23:52:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:09.375 23:52:58 -- common/autotest_common.sh@10 -- # set +x 00:06:09.375 ************************************ 00:06:09.375 END TEST thread_spdk_lock 00:06:09.375 ************************************ 00:06:09.375 00:06:09.375 real 0m3.279s 00:06:09.375 user 0m3.353s 00:06:09.375 sys 0m0.444s 00:06:09.375 23:52:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:09.375 23:52:58 -- common/autotest_common.sh@10 -- # set +x 00:06:09.375 ************************************ 00:06:09.375 END TEST thread 00:06:09.375 ************************************ 00:06:09.375 23:52:58 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:09.375 23:52:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 
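The contend table above lists, per worker, the delay setting and the microseconds spent waiting for versus holding the spinlock. A hedged awk sketch for summarizing that split from a saved copy of the spdk_lock output (the file name is an assumption):

    # Show what fraction of each worker's time went to waiting vs. holding.
    awk '/^Worker/            { inside = 1; next }
         /^PASS test contend/ { inside = 0 }
         inside && NF == 5    {
           printf "worker %s: wait %.0f%%, hold %.0f%%\n", $1, 100*$3/$5, 100*$4/$5
         }' spdk_lock.log

Against the numbers above this gives roughly 47%/53% for worker 0 and 22%/78% for worker 1, consistent with the larger hold time reported for the worker that ran with the larger delay.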
00:06:09.375 23:52:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:09.375 23:52:58 -- common/autotest_common.sh@10 -- # set +x 00:06:09.375 ************************************ 00:06:09.375 START TEST accel 00:06:09.375 ************************************ 00:06:09.375 23:52:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:09.634 * Looking for test storage... 00:06:09.634 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:09.634 23:52:59 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:09.634 23:52:59 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:09.634 23:52:59 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:09.634 23:52:59 -- accel/accel.sh@59 -- # spdk_tgt_pid=459808 00:06:09.634 23:52:59 -- accel/accel.sh@60 -- # waitforlisten 459808 00:06:09.634 23:52:59 -- common/autotest_common.sh@819 -- # '[' -z 459808 ']' 00:06:09.634 23:52:59 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:09.634 23:52:59 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:09.634 23:52:59 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:09.634 23:52:59 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:09.634 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:09.634 23:52:59 -- accel/accel.sh@58 -- # build_accel_config 00:06:09.634 23:52:59 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:09.634 23:52:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:09.634 23:52:59 -- common/autotest_common.sh@10 -- # set +x 00:06:09.634 23:52:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:09.634 23:52:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:09.634 23:52:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:09.634 23:52:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:09.634 23:52:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:09.634 23:52:59 -- accel/accel.sh@42 -- # jq -r . 00:06:09.634 [2024-04-25 23:52:59.052128] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:09.634 [2024-04-25 23:52:59.052204] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid459808 ] 00:06:09.634 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.634 [2024-04-25 23:52:59.119796] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.634 [2024-04-25 23:52:59.158487] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:09.634 [2024-04-25 23:52:59.158634] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.574 23:52:59 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:10.574 23:52:59 -- common/autotest_common.sh@852 -- # return 0 00:06:10.574 23:52:59 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:10.574 23:52:59 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:10.574 23:52:59 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:10.574 23:52:59 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:10.574 23:52:59 -- common/autotest_common.sh@10 -- # set +x 00:06:10.574 23:52:59 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 
23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # IFS== 00:06:10.574 23:52:59 -- accel/accel.sh@64 -- # read -r opc module 00:06:10.574 23:52:59 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:10.574 23:52:59 -- accel/accel.sh@67 -- # killprocess 459808 00:06:10.574 23:52:59 -- common/autotest_common.sh@926 -- # '[' -z 459808 ']' 00:06:10.574 23:52:59 -- common/autotest_common.sh@930 -- # kill -0 459808 00:06:10.574 23:52:59 -- common/autotest_common.sh@931 -- # uname 00:06:10.574 23:52:59 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:10.574 23:52:59 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 459808 00:06:10.574 23:52:59 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:10.574 23:52:59 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:10.574 23:52:59 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 459808' 00:06:10.574 killing process with pid 459808 00:06:10.574 23:52:59 -- common/autotest_common.sh@945 -- # kill 459808 00:06:10.574 23:52:59 -- common/autotest_common.sh@950 -- # wait 459808 00:06:10.833 23:53:00 -- accel/accel.sh@68 -- # trap - ERR 00:06:10.834 23:53:00 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:10.834 23:53:00 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:06:10.834 23:53:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:10.834 23:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:10.834 23:53:00 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:06:10.834 23:53:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:10.834 23:53:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:10.834 23:53:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:10.834 23:53:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:10.834 23:53:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:10.834 23:53:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:10.834 23:53:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:10.834 23:53:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:10.834 23:53:00 -- accel/accel.sh@42 -- # jq -r . 
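The IFS== read loop above is accel.sh consuming accel_get_opc_assignments from the freshly started spdk_tgt and recording that every opcode is currently assigned to the software module. The underlying RPC-plus-jq pattern, assuming scripts/rpc.py from an SPDK checkout and a target already listening on /var/tmp/spdk.sock:

    # Flatten the opcode->module map the way accel.sh does, then read it back
    # as "opc=module" pairs (jq filter copied verbatim from the log above).
    ./scripts/rpc.py accel_get_opc_assignments \
      | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' \
      | while IFS== read -r opc module; do
          echo "opcode $opc handled by $module"
        done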
00:06:10.834 23:53:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:10.834 23:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:10.834 23:53:00 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:10.834 23:53:00 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:10.834 23:53:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:10.834 23:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:10.834 ************************************ 00:06:10.834 START TEST accel_missing_filename 00:06:10.834 ************************************ 00:06:10.834 23:53:00 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:06:10.834 23:53:00 -- common/autotest_common.sh@640 -- # local es=0 00:06:10.834 23:53:00 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:10.834 23:53:00 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:10.834 23:53:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:10.834 23:53:00 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:10.834 23:53:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:10.834 23:53:00 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:06:10.834 23:53:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:10.834 23:53:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:10.834 23:53:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:10.834 23:53:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:10.834 23:53:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:10.834 23:53:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:10.834 23:53:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:10.834 23:53:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:10.834 23:53:00 -- accel/accel.sh@42 -- # jq -r . 00:06:10.834 [2024-04-25 23:53:00.362853] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:10.834 [2024-04-25 23:53:00.362953] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid460108 ] 00:06:10.834 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.834 [2024-04-25 23:53:00.434627] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.093 [2024-04-25 23:53:00.472149] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.093 [2024-04-25 23:53:00.512095] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:11.093 [2024-04-25 23:53:00.572641] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:11.093 A filename is required. 
00:06:11.093 23:53:00 -- common/autotest_common.sh@643 -- # es=234 00:06:11.093 23:53:00 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:11.093 23:53:00 -- common/autotest_common.sh@652 -- # es=106 00:06:11.093 23:53:00 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:11.093 23:53:00 -- common/autotest_common.sh@660 -- # es=1 00:06:11.093 23:53:00 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:11.093 00:06:11.093 real 0m0.292s 00:06:11.093 user 0m0.201s 00:06:11.093 sys 0m0.128s 00:06:11.093 23:53:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:11.093 23:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:11.093 ************************************ 00:06:11.093 END TEST accel_missing_filename 00:06:11.093 ************************************ 00:06:11.093 23:53:00 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:11.093 23:53:00 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:11.093 23:53:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:11.093 23:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:11.093 ************************************ 00:06:11.093 START TEST accel_compress_verify 00:06:11.093 ************************************ 00:06:11.093 23:53:00 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:11.093 23:53:00 -- common/autotest_common.sh@640 -- # local es=0 00:06:11.093 23:53:00 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:11.093 23:53:00 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:11.093 23:53:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:11.093 23:53:00 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:11.093 23:53:00 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:11.093 23:53:00 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:11.093 23:53:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:11.093 23:53:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:11.093 23:53:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:11.093 23:53:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.093 23:53:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.093 23:53:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:11.093 23:53:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:11.093 23:53:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:11.093 23:53:00 -- accel/accel.sh@42 -- # jq -r . 00:06:11.093 [2024-04-25 23:53:00.703479] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
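The es=234 -> es=106 -> es=1 chain above is the NOT wrapper normalizing accel_perf's failure status (234 = 128 + 106, a signal-style exit code) before inverting it, so the test passes precisely because the command failed. A simplified hedged sketch of that pattern:

    # Simplified sketch of the NOT helper used by autotest_common.sh: run a
    # command that is expected to fail, and succeed only if it really failed.
    NOT() {
      local es=0
      "$@" || es=$?
      ((es > 128)) && es=$((es & ~128))   # strip the signal bit, as in the log
      ((es != 0))                         # success iff the wrapped command failed
    }
    NOT false && echo "ok: command failed as expected"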
00:06:11.093 [2024-04-25 23:53:00.703572] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid460138 ] 00:06:11.352 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.352 [2024-04-25 23:53:00.775678] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.352 [2024-04-25 23:53:00.811582] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.352 [2024-04-25 23:53:00.852040] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:11.352 [2024-04-25 23:53:00.911668] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:11.611 00:06:11.611 Compression does not support the verify option, aborting. 00:06:11.611 23:53:00 -- common/autotest_common.sh@643 -- # es=161 00:06:11.611 23:53:00 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:11.611 23:53:00 -- common/autotest_common.sh@652 -- # es=33 00:06:11.611 23:53:00 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:11.611 23:53:00 -- common/autotest_common.sh@660 -- # es=1 00:06:11.611 23:53:00 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:11.611 00:06:11.611 real 0m0.291s 00:06:11.611 user 0m0.200s 00:06:11.611 sys 0m0.131s 00:06:11.611 23:53:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:11.611 23:53:00 -- common/autotest_common.sh@10 -- # set +x 00:06:11.611 ************************************ 00:06:11.611 END TEST accel_compress_verify 00:06:11.611 ************************************ 00:06:11.611 23:53:01 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:11.611 23:53:01 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:11.611 23:53:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:11.611 23:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:11.611 ************************************ 00:06:11.611 START TEST accel_wrong_workload 00:06:11.611 ************************************ 00:06:11.611 23:53:01 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:06:11.611 23:53:01 -- common/autotest_common.sh@640 -- # local es=0 00:06:11.611 23:53:01 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:11.611 23:53:01 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:11.611 23:53:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:11.611 23:53:01 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:11.611 23:53:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:11.611 23:53:01 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:06:11.611 23:53:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:11.611 23:53:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:11.611 23:53:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:11.611 23:53:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.611 23:53:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.611 23:53:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:11.611 23:53:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:11.611 23:53:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:11.611 23:53:01 -- accel/accel.sh@42 -- # jq -r . 
00:06:11.611 Unsupported workload type: foobar 00:06:11.611 [2024-04-25 23:53:01.038512] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:11.611 accel_perf options: 00:06:11.611 [-h help message] 00:06:11.611 [-q queue depth per core] 00:06:11.611 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:11.611 [-T number of threads per core 00:06:11.611 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:11.611 [-t time in seconds] 00:06:11.611 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:11.611 [ dif_verify, , dif_generate, dif_generate_copy 00:06:11.611 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:11.611 [-l for compress/decompress workloads, name of uncompressed input file 00:06:11.611 [-S for crc32c workload, use this seed value (default 0) 00:06:11.611 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:11.611 [-f for fill workload, use this BYTE value (default 255) 00:06:11.611 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:11.611 [-y verify result if this switch is on] 00:06:11.611 [-a tasks to allocate per core (default: same value as -q)] 00:06:11.611 Can be used to spread operations across a wider range of memory. 00:06:11.611 23:53:01 -- common/autotest_common.sh@643 -- # es=1 00:06:11.611 23:53:01 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:11.611 23:53:01 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:11.611 23:53:01 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:11.611 00:06:11.611 real 0m0.029s 00:06:11.611 user 0m0.009s 00:06:11.611 sys 0m0.019s 00:06:11.611 23:53:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:11.611 23:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:11.611 ************************************ 00:06:11.611 END TEST accel_wrong_workload 00:06:11.611 ************************************ 00:06:11.611 Error: writing output failed: Broken pipe 00:06:11.611 23:53:01 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:11.611 23:53:01 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:11.611 23:53:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:11.611 23:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:11.611 ************************************ 00:06:11.611 START TEST accel_negative_buffers 00:06:11.611 ************************************ 00:06:11.611 23:53:01 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:11.611 23:53:01 -- common/autotest_common.sh@640 -- # local es=0 00:06:11.611 23:53:01 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:11.611 23:53:01 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:11.611 23:53:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:11.611 23:53:01 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:11.611 23:53:01 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:11.611 23:53:01 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:06:11.611 23:53:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:11.611 23:53:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:11.611 23:53:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:11.611 23:53:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.611 23:53:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.611 23:53:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:11.611 23:53:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:11.611 23:53:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:11.611 23:53:01 -- accel/accel.sh@42 -- # jq -r . 00:06:11.611 -x option must be non-negative. 00:06:11.611 [2024-04-25 23:53:01.110653] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:11.611 accel_perf options: 00:06:11.611 [-h help message] 00:06:11.611 [-q queue depth per core] 00:06:11.611 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:11.611 [-T number of threads per core 00:06:11.611 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:11.611 [-t time in seconds] 00:06:11.611 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:11.611 [ dif_verify, , dif_generate, dif_generate_copy 00:06:11.611 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:11.612 [-l for compress/decompress workloads, name of uncompressed input file 00:06:11.612 [-S for crc32c workload, use this seed value (default 0) 00:06:11.612 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:11.612 [-f for fill workload, use this BYTE value (default 255) 00:06:11.612 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:11.612 [-y verify result if this switch is on] 00:06:11.612 [-a tasks to allocate per core (default: same value as -q)] 00:06:11.612 Can be used to spread operations across a wider range of memory. 
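The two option dumps above are accel_perf rejecting an unknown -w value and a negative -x, respectively. Combining the documented flags into a valid run looks like the following; the relative build path assumes a standard SPDK build tree, and every flag is taken from the usage text above:

    # 1 second of software crc32c over 4 KiB buffers, seed 32, verify results.
    ./build/examples/accel_perf -t 1 -w crc32c -o 4096 -S 32 -y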
00:06:11.612 23:53:01 -- common/autotest_common.sh@643 -- # es=1 00:06:11.612 23:53:01 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:11.612 23:53:01 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:11.612 23:53:01 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:11.612 00:06:11.612 real 0m0.028s 00:06:11.612 user 0m0.013s 00:06:11.612 sys 0m0.015s 00:06:11.612 23:53:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:11.612 23:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:11.612 ************************************ 00:06:11.612 END TEST accel_negative_buffers 00:06:11.612 ************************************ 00:06:11.612 Error: writing output failed: Broken pipe 00:06:11.612 23:53:01 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:11.612 23:53:01 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:11.612 23:53:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:11.612 23:53:01 -- common/autotest_common.sh@10 -- # set +x 00:06:11.612 ************************************ 00:06:11.612 START TEST accel_crc32c 00:06:11.612 ************************************ 00:06:11.612 23:53:01 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:11.612 23:53:01 -- accel/accel.sh@16 -- # local accel_opc 00:06:11.612 23:53:01 -- accel/accel.sh@17 -- # local accel_module 00:06:11.612 23:53:01 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:11.612 23:53:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:11.612 23:53:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:11.612 23:53:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:11.612 23:53:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.612 23:53:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.612 23:53:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:11.612 23:53:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:11.612 23:53:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:11.612 23:53:01 -- accel/accel.sh@42 -- # jq -r . 00:06:11.612 [2024-04-25 23:53:01.179656] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:11.612 [2024-04-25 23:53:01.179736] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid460270 ] 00:06:11.612 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.871 [2024-04-25 23:53:01.248798] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.871 [2024-04-25 23:53:01.284509] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.252 23:53:02 -- accel/accel.sh@18 -- # out=' 00:06:13.252 SPDK Configuration: 00:06:13.252 Core mask: 0x1 00:06:13.252 00:06:13.252 Accel Perf Configuration: 00:06:13.252 Workload Type: crc32c 00:06:13.252 CRC-32C seed: 32 00:06:13.252 Transfer size: 4096 bytes 00:06:13.252 Vector count 1 00:06:13.252 Module: software 00:06:13.252 Queue depth: 32 00:06:13.252 Allocate depth: 32 00:06:13.252 # threads/core: 1 00:06:13.252 Run time: 1 seconds 00:06:13.252 Verify: Yes 00:06:13.252 00:06:13.252 Running for 1 seconds... 
00:06:13.252 00:06:13.252 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:13.252 ------------------------------------------------------------------------------------ 00:06:13.252 0,0 844096/s 3297 MiB/s 0 0 00:06:13.252 ==================================================================================== 00:06:13.252 Total 844096/s 3297 MiB/s 0 0' 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:13.252 23:53:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:13.252 23:53:02 -- accel/accel.sh@12 -- # build_accel_config 00:06:13.252 23:53:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:13.252 23:53:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:13.252 23:53:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:13.252 23:53:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:13.252 23:53:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:13.252 23:53:02 -- accel/accel.sh@41 -- # local IFS=, 00:06:13.252 23:53:02 -- accel/accel.sh@42 -- # jq -r . 00:06:13.252 [2024-04-25 23:53:02.463455] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:13.252 [2024-04-25 23:53:02.463544] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid460461 ] 00:06:13.252 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.252 [2024-04-25 23:53:02.532738] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.252 [2024-04-25 23:53:02.566925] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val= 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val= 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val=0x1 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val= 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val= 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val=crc32c 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val=32 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 
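The Bandwidth column above follows directly from the transfer rate and the 4096-byte transfer size. Checking the crc32c summary's 3297 MiB/s:

    # Bandwidth check for the crc32c run above: 844096 ops/s * 4096 B per op.
    awk 'BEGIN { printf "%d MiB/s\n", 844096 * 4096 / (1024 * 1024) }'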
23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val= 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val=software 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@23 -- # accel_module=software 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val=32 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val=32 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val=1 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val=Yes 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val= 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:13.252 23:53:02 -- accel/accel.sh@21 -- # val= 00:06:13.252 23:53:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # IFS=: 00:06:13.252 23:53:02 -- accel/accel.sh@20 -- # read -r var val 00:06:14.189 23:53:03 -- accel/accel.sh@21 -- # val= 00:06:14.189 23:53:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # IFS=: 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # read -r var val 00:06:14.189 23:53:03 -- accel/accel.sh@21 -- # val= 00:06:14.189 23:53:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # IFS=: 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # read -r var val 00:06:14.189 23:53:03 -- accel/accel.sh@21 -- # val= 00:06:14.189 23:53:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # IFS=: 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # read -r var val 00:06:14.189 23:53:03 -- accel/accel.sh@21 -- # val= 00:06:14.189 23:53:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # IFS=: 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # read -r var val 00:06:14.189 23:53:03 -- accel/accel.sh@21 -- # val= 00:06:14.189 23:53:03 -- accel/accel.sh@22 -- # case "$var" in 
00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # IFS=: 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # read -r var val 00:06:14.189 23:53:03 -- accel/accel.sh@21 -- # val= 00:06:14.189 23:53:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # IFS=: 00:06:14.189 23:53:03 -- accel/accel.sh@20 -- # read -r var val 00:06:14.189 23:53:03 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:14.189 23:53:03 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:14.189 23:53:03 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:14.189 00:06:14.189 real 0m2.572s 00:06:14.189 user 0m2.334s 00:06:14.189 sys 0m0.247s 00:06:14.189 23:53:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.189 23:53:03 -- common/autotest_common.sh@10 -- # set +x 00:06:14.189 ************************************ 00:06:14.189 END TEST accel_crc32c 00:06:14.189 ************************************ 00:06:14.189 23:53:03 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:14.189 23:53:03 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:14.189 23:53:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:14.189 23:53:03 -- common/autotest_common.sh@10 -- # set +x 00:06:14.189 ************************************ 00:06:14.189 START TEST accel_crc32c_C2 00:06:14.189 ************************************ 00:06:14.189 23:53:03 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:14.189 23:53:03 -- accel/accel.sh@16 -- # local accel_opc 00:06:14.189 23:53:03 -- accel/accel.sh@17 -- # local accel_module 00:06:14.189 23:53:03 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:14.189 23:53:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:14.189 23:53:03 -- accel/accel.sh@12 -- # build_accel_config 00:06:14.189 23:53:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:14.189 23:53:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:14.189 23:53:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:14.189 23:53:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:14.189 23:53:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:14.189 23:53:03 -- accel/accel.sh@41 -- # local IFS=, 00:06:14.189 23:53:03 -- accel/accel.sh@42 -- # jq -r . 00:06:14.448 [2024-04-25 23:53:03.800887] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:14.448 [2024-04-25 23:53:03.801001] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid460750 ] 00:06:14.448 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.448 [2024-04-25 23:53:03.869877] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.448 [2024-04-25 23:53:03.904806] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.827 23:53:05 -- accel/accel.sh@18 -- # out=' 00:06:15.827 SPDK Configuration: 00:06:15.828 Core mask: 0x1 00:06:15.828 00:06:15.828 Accel Perf Configuration: 00:06:15.828 Workload Type: crc32c 00:06:15.828 CRC-32C seed: 0 00:06:15.828 Transfer size: 4096 bytes 00:06:15.828 Vector count 2 00:06:15.828 Module: software 00:06:15.828 Queue depth: 32 00:06:15.828 Allocate depth: 32 00:06:15.828 # threads/core: 1 00:06:15.828 Run time: 1 seconds 00:06:15.828 Verify: Yes 00:06:15.828 00:06:15.828 Running for 1 seconds... 00:06:15.828 00:06:15.828 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:15.828 ------------------------------------------------------------------------------------ 00:06:15.828 0,0 610240/s 4767 MiB/s 0 0 00:06:15.828 ==================================================================================== 00:06:15.828 Total 610240/s 2383 MiB/s 0 0' 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:15.828 23:53:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:15.828 23:53:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:15.828 23:53:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:15.828 23:53:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:15.828 23:53:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:15.828 23:53:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:15.828 23:53:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:15.828 23:53:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:15.828 23:53:05 -- accel/accel.sh@42 -- # jq -r . 00:06:15.828 [2024-04-25 23:53:05.083541] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:15.828 [2024-04-25 23:53:05.083631] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid461017 ] 00:06:15.828 EAL: No free 2048 kB hugepages reported on node 1 00:06:15.828 [2024-04-25 23:53:05.152564] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.828 [2024-04-25 23:53:05.186758] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val= 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val= 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val=0x1 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val= 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val= 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val=crc32c 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val=0 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val= 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val=software 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@23 -- # accel_module=software 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val=32 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val=32 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- 
accel/accel.sh@21 -- # val=1 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val=Yes 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val= 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:15.828 23:53:05 -- accel/accel.sh@21 -- # val= 00:06:15.828 23:53:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # IFS=: 00:06:15.828 23:53:05 -- accel/accel.sh@20 -- # read -r var val 00:06:16.765 23:53:06 -- accel/accel.sh@21 -- # val= 00:06:16.766 23:53:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # IFS=: 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # read -r var val 00:06:16.766 23:53:06 -- accel/accel.sh@21 -- # val= 00:06:16.766 23:53:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # IFS=: 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # read -r var val 00:06:16.766 23:53:06 -- accel/accel.sh@21 -- # val= 00:06:16.766 23:53:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # IFS=: 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # read -r var val 00:06:16.766 23:53:06 -- accel/accel.sh@21 -- # val= 00:06:16.766 23:53:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # IFS=: 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # read -r var val 00:06:16.766 23:53:06 -- accel/accel.sh@21 -- # val= 00:06:16.766 23:53:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # IFS=: 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # read -r var val 00:06:16.766 23:53:06 -- accel/accel.sh@21 -- # val= 00:06:16.766 23:53:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # IFS=: 00:06:16.766 23:53:06 -- accel/accel.sh@20 -- # read -r var val 00:06:16.766 23:53:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:16.766 23:53:06 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:16.766 23:53:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:16.766 00:06:16.766 real 0m2.572s 00:06:16.766 user 0m2.328s 00:06:16.766 sys 0m0.253s 00:06:16.766 23:53:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.766 23:53:06 -- common/autotest_common.sh@10 -- # set +x 00:06:16.766 ************************************ 00:06:16.766 END TEST accel_crc32c_C2 00:06:16.766 ************************************ 00:06:17.025 23:53:06 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:17.025 23:53:06 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:17.025 23:53:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:17.025 23:53:06 -- common/autotest_common.sh@10 -- # set +x 00:06:17.025 ************************************ 00:06:17.025 START TEST accel_copy 
00:06:17.025 ************************************ 00:06:17.025 23:53:06 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:06:17.025 23:53:06 -- accel/accel.sh@16 -- # local accel_opc 00:06:17.025 23:53:06 -- accel/accel.sh@17 -- # local accel_module 00:06:17.025 23:53:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:17.025 23:53:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:17.025 23:53:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:17.025 23:53:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:17.025 23:53:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:17.025 23:53:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:17.025 23:53:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:17.025 23:53:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:17.025 23:53:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:17.025 23:53:06 -- accel/accel.sh@42 -- # jq -r . 00:06:17.025 [2024-04-25 23:53:06.422285] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:17.025 [2024-04-25 23:53:06.422378] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid461305 ] 00:06:17.025 EAL: No free 2048 kB hugepages reported on node 1 00:06:17.025 [2024-04-25 23:53:06.491278] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.025 [2024-04-25 23:53:06.524763] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.402 23:53:07 -- accel/accel.sh@18 -- # out=' 00:06:18.402 SPDK Configuration: 00:06:18.402 Core mask: 0x1 00:06:18.402 00:06:18.402 Accel Perf Configuration: 00:06:18.402 Workload Type: copy 00:06:18.402 Transfer size: 4096 bytes 00:06:18.402 Vector count 1 00:06:18.402 Module: software 00:06:18.402 Queue depth: 32 00:06:18.402 Allocate depth: 32 00:06:18.402 # threads/core: 1 00:06:18.402 Run time: 1 seconds 00:06:18.402 Verify: Yes 00:06:18.402 00:06:18.402 Running for 1 seconds... 00:06:18.402 00:06:18.402 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:18.402 ------------------------------------------------------------------------------------ 00:06:18.402 0,0 560032/s 2187 MiB/s 0 0 00:06:18.402 ==================================================================================== 00:06:18.402 Total 560032/s 2187 MiB/s 0 0' 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:18.402 23:53:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:18.402 23:53:07 -- accel/accel.sh@12 -- # build_accel_config 00:06:18.402 23:53:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:18.402 23:53:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:18.402 23:53:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:18.402 23:53:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:18.402 23:53:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:18.402 23:53:07 -- accel/accel.sh@41 -- # local IFS=, 00:06:18.402 23:53:07 -- accel/accel.sh@42 -- # jq -r . 00:06:18.402 [2024-04-25 23:53:07.707378] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:18.402 [2024-04-25 23:53:07.707482] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid461553 ] 00:06:18.402 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.402 [2024-04-25 23:53:07.777761] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.402 [2024-04-25 23:53:07.811637] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val= 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val= 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val=0x1 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val= 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val= 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val=copy 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val= 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val=software 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@23 -- # accel_module=software 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val=32 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val=32 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val=1 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val=Yes 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val= 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:18.402 23:53:07 -- accel/accel.sh@21 -- # val= 00:06:18.402 23:53:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # IFS=: 00:06:18.402 23:53:07 -- accel/accel.sh@20 -- # read -r var val 00:06:19.780 23:53:08 -- accel/accel.sh@21 -- # val= 00:06:19.780 23:53:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # IFS=: 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # read -r var val 00:06:19.780 23:53:08 -- accel/accel.sh@21 -- # val= 00:06:19.780 23:53:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # IFS=: 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # read -r var val 00:06:19.780 23:53:08 -- accel/accel.sh@21 -- # val= 00:06:19.780 23:53:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # IFS=: 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # read -r var val 00:06:19.780 23:53:08 -- accel/accel.sh@21 -- # val= 00:06:19.780 23:53:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # IFS=: 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # read -r var val 00:06:19.780 23:53:08 -- accel/accel.sh@21 -- # val= 00:06:19.780 23:53:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # IFS=: 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # read -r var val 00:06:19.780 23:53:08 -- accel/accel.sh@21 -- # val= 00:06:19.780 23:53:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # IFS=: 00:06:19.780 23:53:08 -- accel/accel.sh@20 -- # read -r var val 00:06:19.780 23:53:08 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:19.780 23:53:08 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:19.780 23:53:08 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:19.780 00:06:19.780 real 0m2.578s 00:06:19.780 user 0m2.322s 00:06:19.780 sys 0m0.266s 00:06:19.780 23:53:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.780 23:53:08 -- common/autotest_common.sh@10 -- # set +x 00:06:19.780 ************************************ 00:06:19.781 END TEST accel_copy 00:06:19.781 ************************************ 00:06:19.781 23:53:09 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:19.781 23:53:09 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:19.781 23:53:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:19.781 23:53:09 -- common/autotest_common.sh@10 -- # set +x 00:06:19.781 ************************************ 00:06:19.781 START TEST accel_fill 00:06:19.781 ************************************ 00:06:19.781 23:53:09 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:19.781 23:53:09 -- accel/accel.sh@16 -- # local accel_opc 
00:06:19.781 23:53:09 -- accel/accel.sh@17 -- # local accel_module 00:06:19.781 23:53:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:19.781 23:53:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:19.781 23:53:09 -- accel/accel.sh@12 -- # build_accel_config 00:06:19.781 23:53:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:19.781 23:53:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:19.781 23:53:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:19.781 23:53:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:19.781 23:53:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:19.781 23:53:09 -- accel/accel.sh@41 -- # local IFS=, 00:06:19.781 23:53:09 -- accel/accel.sh@42 -- # jq -r . 00:06:19.781 [2024-04-25 23:53:09.047000] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:19.781 [2024-04-25 23:53:09.047096] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid461744 ] 00:06:19.781 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.781 [2024-04-25 23:53:09.116179] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.781 [2024-04-25 23:53:09.151150] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.717 23:53:10 -- accel/accel.sh@18 -- # out=' 00:06:20.718 SPDK Configuration: 00:06:20.718 Core mask: 0x1 00:06:20.718 00:06:20.718 Accel Perf Configuration: 00:06:20.718 Workload Type: fill 00:06:20.718 Fill pattern: 0x80 00:06:20.718 Transfer size: 4096 bytes 00:06:20.718 Vector count 1 00:06:20.718 Module: software 00:06:20.718 Queue depth: 64 00:06:20.718 Allocate depth: 64 00:06:20.718 # threads/core: 1 00:06:20.718 Run time: 1 seconds 00:06:20.718 Verify: Yes 00:06:20.718 00:06:20.718 Running for 1 seconds... 00:06:20.718 00:06:20.718 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:20.718 ------------------------------------------------------------------------------------ 00:06:20.718 0,0 969920/s 3788 MiB/s 0 0 00:06:20.718 ==================================================================================== 00:06:20.718 Total 969920/s 3788 MiB/s 0 0' 00:06:20.718 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.718 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.718 23:53:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:20.718 23:53:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:20.718 23:53:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:20.718 23:53:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:20.718 23:53:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:20.718 23:53:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:20.718 23:53:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:20.718 23:53:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:20.718 23:53:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:20.718 23:53:10 -- accel/accel.sh@42 -- # jq -r . 00:06:20.977 [2024-04-25 23:53:10.333625] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
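The fill workload above writes a one-byte pattern across every buffer; the -f 128 on the command line is the "Fill pattern: 0x80" in the configuration dump, and a software path can satisfy it with a plain memset, which is consistent with fill posting the highest rate in this run. A minimal sketch of the operation (illustrative, not SPDK's actual module code):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Conceptual software 'fill': replicate a one-byte pattern across dst. */
static void fill(void *dst, uint8_t pattern, size_t len)
{
    memset(dst, pattern, len);
}

int main(void)
{
    static uint8_t buf[4096];
    fill(buf, 0x80, sizeof(buf));            /* -f 128 == 0x80 */
    assert(buf[0] == 0x80 && buf[4095] == 0x80);
    return 0;
}
```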
00:06:20.977 [2024-04-25 23:53:10.333731] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid461889 ] 00:06:20.977 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.977 [2024-04-25 23:53:10.402836] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.977 [2024-04-25 23:53:10.437579] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val= 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val= 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val=0x1 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val= 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val= 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val=fill 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val=0x80 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val= 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val=software 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@23 -- # accel_module=software 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val=64 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val=64 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- 
accel/accel.sh@21 -- # val=1 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val=Yes 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val= 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:20.977 23:53:10 -- accel/accel.sh@21 -- # val= 00:06:20.977 23:53:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # IFS=: 00:06:20.977 23:53:10 -- accel/accel.sh@20 -- # read -r var val 00:06:22.356 23:53:11 -- accel/accel.sh@21 -- # val= 00:06:22.356 23:53:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # IFS=: 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # read -r var val 00:06:22.356 23:53:11 -- accel/accel.sh@21 -- # val= 00:06:22.356 23:53:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # IFS=: 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # read -r var val 00:06:22.356 23:53:11 -- accel/accel.sh@21 -- # val= 00:06:22.356 23:53:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # IFS=: 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # read -r var val 00:06:22.356 23:53:11 -- accel/accel.sh@21 -- # val= 00:06:22.356 23:53:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # IFS=: 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # read -r var val 00:06:22.356 23:53:11 -- accel/accel.sh@21 -- # val= 00:06:22.356 23:53:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # IFS=: 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # read -r var val 00:06:22.356 23:53:11 -- accel/accel.sh@21 -- # val= 00:06:22.356 23:53:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # IFS=: 00:06:22.356 23:53:11 -- accel/accel.sh@20 -- # read -r var val 00:06:22.356 23:53:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:22.356 23:53:11 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:22.356 23:53:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:22.356 00:06:22.356 real 0m2.578s 00:06:22.356 user 0m2.325s 00:06:22.356 sys 0m0.263s 00:06:22.356 23:53:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.356 23:53:11 -- common/autotest_common.sh@10 -- # set +x 00:06:22.356 ************************************ 00:06:22.356 END TEST accel_fill 00:06:22.356 ************************************ 00:06:22.356 23:53:11 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:22.356 23:53:11 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:22.356 23:53:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:22.356 23:53:11 -- common/autotest_common.sh@10 -- # set +x 00:06:22.356 ************************************ 00:06:22.356 START TEST 
accel_copy_crc32c 00:06:22.356 ************************************ 00:06:22.356 23:53:11 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:06:22.356 23:53:11 -- accel/accel.sh@16 -- # local accel_opc 00:06:22.356 23:53:11 -- accel/accel.sh@17 -- # local accel_module 00:06:22.356 23:53:11 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:22.356 23:53:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:22.356 23:53:11 -- accel/accel.sh@12 -- # build_accel_config 00:06:22.356 23:53:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:22.356 23:53:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.356 23:53:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.356 23:53:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:22.356 23:53:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:22.356 23:53:11 -- accel/accel.sh@41 -- # local IFS=, 00:06:22.356 23:53:11 -- accel/accel.sh@42 -- # jq -r . 00:06:22.356 [2024-04-25 23:53:11.674034] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:22.356 [2024-04-25 23:53:11.674124] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid462160 ] 00:06:22.356 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.356 [2024-04-25 23:53:11.743990] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.356 [2024-04-25 23:53:11.779180] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.764 23:53:12 -- accel/accel.sh@18 -- # out=' 00:06:23.764 SPDK Configuration: 00:06:23.764 Core mask: 0x1 00:06:23.764 00:06:23.764 Accel Perf Configuration: 00:06:23.764 Workload Type: copy_crc32c 00:06:23.764 CRC-32C seed: 0 00:06:23.764 Vector size: 4096 bytes 00:06:23.764 Transfer size: 4096 bytes 00:06:23.764 Vector count 1 00:06:23.764 Module: software 00:06:23.764 Queue depth: 32 00:06:23.764 Allocate depth: 32 00:06:23.764 # threads/core: 1 00:06:23.764 Run time: 1 seconds 00:06:23.764 Verify: Yes 00:06:23.764 00:06:23.765 Running for 1 seconds... 00:06:23.765 00:06:23.765 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:23.765 ------------------------------------------------------------------------------------ 00:06:23.765 0,0 413280/s 1614 MiB/s 0 0 00:06:23.765 ==================================================================================== 00:06:23.765 Total 413280/s 1614 MiB/s 0 0' 00:06:23.765 23:53:12 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:12 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:12 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:23.765 23:53:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:23.765 23:53:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:23.765 23:53:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:23.765 23:53:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:23.765 23:53:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:23.765 23:53:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:23.765 23:53:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:23.765 23:53:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:23.765 23:53:12 -- accel/accel.sh@42 -- # jq -r . 
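copy_crc32c combines two primitives: each 4096-byte source is copied to the destination and a CRC-32C (Castagnoli) with the configured seed of 0 is computed over the data, which is why its rate (413280/s) sits well below plain copy (560032/s). A bit-at-a-time sketch of the idea, illustrative only; a real software module would use an optimized table-driven or hardware-assisted CRC:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Bit-at-a-time CRC-32C (Castagnoli, reflected polynomial 0x82F63B78). */
static uint32_t crc32c(uint32_t seed, const uint8_t *buf, size_t len)
{
    uint32_t crc = ~seed;
    while (len--) {
        crc ^= *buf++;
        for (int k = 0; k < 8; k++)
            crc = (crc >> 1) ^ ((crc & 1u) ? 0x82F63B78u : 0u);
    }
    return ~crc;
}

/* Conceptual copy_crc32c: copy src to dst, return the CRC of the payload. */
static uint32_t copy_crc32c(void *dst, const void *src, size_t len, uint32_t seed)
{
    memcpy(dst, src, len);
    return crc32c(seed, src, len);
}

int main(void)
{
    static uint8_t src[4096], dst[4096];
    memset(src, 0xA5, sizeof(src));
    printf("crc=0x%08x\n", (unsigned)copy_crc32c(dst, src, sizeof(src), 0));
    return 0;
}
```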
00:06:23.765 [2024-04-25 23:53:12.962990] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:23.765 [2024-04-25 23:53:12.963113] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid462434 ] 00:06:23.765 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.765 [2024-04-25 23:53:13.032858] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.765 [2024-04-25 23:53:13.067254] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val= 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val= 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val=0x1 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val= 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val= 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val=0 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val= 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val=software 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@23 -- # accel_module=software 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val=32 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 
00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val=32 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val=1 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val=Yes 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val= 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:23.765 23:53:13 -- accel/accel.sh@21 -- # val= 00:06:23.765 23:53:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # IFS=: 00:06:23.765 23:53:13 -- accel/accel.sh@20 -- # read -r var val 00:06:24.700 23:53:14 -- accel/accel.sh@21 -- # val= 00:06:24.700 23:53:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # IFS=: 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # read -r var val 00:06:24.700 23:53:14 -- accel/accel.sh@21 -- # val= 00:06:24.700 23:53:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # IFS=: 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # read -r var val 00:06:24.700 23:53:14 -- accel/accel.sh@21 -- # val= 00:06:24.700 23:53:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # IFS=: 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # read -r var val 00:06:24.700 23:53:14 -- accel/accel.sh@21 -- # val= 00:06:24.700 23:53:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # IFS=: 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # read -r var val 00:06:24.700 23:53:14 -- accel/accel.sh@21 -- # val= 00:06:24.700 23:53:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # IFS=: 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # read -r var val 00:06:24.700 23:53:14 -- accel/accel.sh@21 -- # val= 00:06:24.700 23:53:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # IFS=: 00:06:24.700 23:53:14 -- accel/accel.sh@20 -- # read -r var val 00:06:24.700 23:53:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:24.700 23:53:14 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:24.700 23:53:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:24.700 00:06:24.700 real 0m2.581s 00:06:24.700 user 0m2.330s 00:06:24.700 sys 0m0.261s 00:06:24.700 23:53:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:24.700 23:53:14 -- common/autotest_common.sh@10 -- # set +x 00:06:24.700 ************************************ 00:06:24.700 END TEST accel_copy_crc32c 00:06:24.700 ************************************ 00:06:24.700 
23:53:14 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:24.700 23:53:14 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:24.700 23:53:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:24.700 23:53:14 -- common/autotest_common.sh@10 -- # set +x 00:06:24.700 ************************************ 00:06:24.700 START TEST accel_copy_crc32c_C2 00:06:24.700 ************************************ 00:06:24.700 23:53:14 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:24.700 23:53:14 -- accel/accel.sh@16 -- # local accel_opc 00:06:24.700 23:53:14 -- accel/accel.sh@17 -- # local accel_module 00:06:24.700 23:53:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:24.700 23:53:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:24.700 23:53:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.700 23:53:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.700 23:53:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.700 23:53:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.700 23:53:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.700 23:53:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.700 23:53:14 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.700 23:53:14 -- accel/accel.sh@42 -- # jq -r . 00:06:24.700 [2024-04-25 23:53:14.303926] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:24.701 [2024-04-25 23:53:14.304015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid462717 ] 00:06:24.959 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.959 [2024-04-25 23:53:14.374206] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.959 [2024-04-25 23:53:14.409162] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.337 23:53:15 -- accel/accel.sh@18 -- # out=' 00:06:26.337 SPDK Configuration: 00:06:26.337 Core mask: 0x1 00:06:26.337 00:06:26.337 Accel Perf Configuration: 00:06:26.337 Workload Type: copy_crc32c 00:06:26.337 CRC-32C seed: 0 00:06:26.337 Vector size: 4096 bytes 00:06:26.337 Transfer size: 8192 bytes 00:06:26.337 Vector count 2 00:06:26.337 Module: software 00:06:26.337 Queue depth: 32 00:06:26.337 Allocate depth: 32 00:06:26.337 # threads/core: 1 00:06:26.337 Run time: 1 seconds 00:06:26.337 Verify: Yes 00:06:26.337 00:06:26.337 Running for 1 seconds... 
00:06:26.337 00:06:26.337 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:26.337 ------------------------------------------------------------------------------------ 00:06:26.337 0,0 295456/s 2308 MiB/s 0 0 00:06:26.337 ==================================================================================== 00:06:26.337 Total 295456/s 2308 MiB/s 0 0' 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:26.337 23:53:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:26.337 23:53:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:26.337 23:53:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:26.337 23:53:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.337 23:53:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.337 23:53:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:26.337 23:53:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:26.337 23:53:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:26.337 23:53:15 -- accel/accel.sh@42 -- # jq -r . 00:06:26.337 [2024-04-25 23:53:15.590053] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:26.337 [2024-04-25 23:53:15.590143] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid462988 ] 00:06:26.337 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.337 [2024-04-25 23:53:15.658657] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.337 [2024-04-25 23:53:15.692573] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val= 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val= 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val=0x1 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val= 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val= 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val=0 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=:
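In this chained variant a vector count of 2 means each operation spans two 4096-byte buffers, so the 8192-byte transfer size governs the bandwidth accounting, as in the other tables of this run: 295456/s x 8192 bytes is about 2308 MiB/s, matching the results above. Conceptually the CRC accumulates across the chain while each segment is copied; a sketch under that assumption (the iovec-style layout is illustrative, not SPDK's API):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

struct iov { uint8_t *base; size_t len; };

/* Reflected CRC-32C update (polynomial 0x82F63B78), chainable form. */
static uint32_t crc32c_update(uint32_t crc, const uint8_t *buf, size_t len)
{
    while (len--) {
        crc ^= *buf++;
        for (int k = 0; k < 8; k++)
            crc = (crc >> 1) ^ ((crc & 1u) ? 0x82F63B78u : 0u);
    }
    return crc;
}

/* Copy a chained source (e.g. 2 x 4096 bytes) segment by segment while
 * accumulating one CRC over the whole 8192-byte payload. */
static uint32_t copy_crc32c_chained(struct iov *dst, const struct iov *src,
                                    int iovcnt, uint32_t seed)
{
    uint32_t crc = ~seed;
    for (int i = 0; i < iovcnt; i++) {
        memcpy(dst[i].base, src[i].base, src[i].len);
        crc = crc32c_update(crc, src[i].base, src[i].len);
    }
    return ~crc;
}

int main(void)
{
    static uint8_t s0[4096], s1[4096], d0[4096], d1[4096];
    struct iov src[2] = { { s0, sizeof(s0) }, { s1, sizeof(s1) } };
    struct iov dst[2] = { { d0, sizeof(d0) }, { d1, sizeof(d1) } };
    printf("crc=0x%08x\n", (unsigned)copy_crc32c_chained(dst, src, 2, 0));
    return 0;
}
```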
00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val= 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val=software 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@23 -- # accel_module=software 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val=32 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val=32 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val=1 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val=Yes 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val= 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:26.337 23:53:15 -- accel/accel.sh@21 -- # val= 00:06:26.337 23:53:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # IFS=: 00:06:26.337 23:53:15 -- accel/accel.sh@20 -- # read -r var val 00:06:27.274 23:53:16 -- accel/accel.sh@21 -- # val= 00:06:27.274 23:53:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # IFS=: 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # read -r var val 00:06:27.274 23:53:16 -- accel/accel.sh@21 -- # val= 00:06:27.274 23:53:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # IFS=: 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # read -r var val 00:06:27.274 23:53:16 -- accel/accel.sh@21 -- # val= 00:06:27.274 23:53:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # IFS=: 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # read -r var val 00:06:27.274 23:53:16 -- accel/accel.sh@21 -- # val= 00:06:27.274 23:53:16 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # IFS=: 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # read -r var val 00:06:27.274 23:53:16 -- accel/accel.sh@21 -- # val= 00:06:27.274 23:53:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # IFS=: 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # read -r var val 00:06:27.274 23:53:16 -- accel/accel.sh@21 -- # val= 00:06:27.274 23:53:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # IFS=: 00:06:27.274 23:53:16 -- accel/accel.sh@20 -- # read -r var val 00:06:27.274 23:53:16 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:27.274 23:53:16 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:27.274 23:53:16 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:27.274 00:06:27.274 real 0m2.577s 00:06:27.274 user 0m2.335s 00:06:27.274 sys 0m0.251s 00:06:27.274 23:53:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.274 23:53:16 -- common/autotest_common.sh@10 -- # set +x 00:06:27.274 ************************************ 00:06:27.274 END TEST accel_copy_crc32c_C2 00:06:27.274 ************************************ 00:06:27.533 23:53:16 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:27.533 23:53:16 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:27.533 23:53:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:27.533 23:53:16 -- common/autotest_common.sh@10 -- # set +x 00:06:27.533 ************************************ 00:06:27.533 START TEST accel_dualcast 00:06:27.533 ************************************ 00:06:27.533 23:53:16 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:06:27.533 23:53:16 -- accel/accel.sh@16 -- # local accel_opc 00:06:27.533 23:53:16 -- accel/accel.sh@17 -- # local accel_module 00:06:27.533 23:53:16 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:27.533 23:53:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:27.533 23:53:16 -- accel/accel.sh@12 -- # build_accel_config 00:06:27.533 23:53:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:27.533 23:53:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:27.533 23:53:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:27.533 23:53:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:27.533 23:53:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:27.533 23:53:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:27.533 23:53:16 -- accel/accel.sh@42 -- # jq -r . 00:06:27.533 [2024-04-25 23:53:16.928795] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:06:27.533 [2024-04-25 23:53:16.928889] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid463269 ] 00:06:27.533 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.533 [2024-04-25 23:53:17.000269] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.533 [2024-04-25 23:53:17.036525] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.910 23:53:18 -- accel/accel.sh@18 -- # out=' 00:06:28.910 SPDK Configuration: 00:06:28.910 Core mask: 0x1 00:06:28.910 00:06:28.910 Accel Perf Configuration: 00:06:28.910 Workload Type: dualcast 00:06:28.910 Transfer size: 4096 bytes 00:06:28.910 Vector count 1 00:06:28.910 Module: software 00:06:28.910 Queue depth: 32 00:06:28.910 Allocate depth: 32 00:06:28.910 # threads/core: 1 00:06:28.910 Run time: 1 seconds 00:06:28.910 Verify: Yes 00:06:28.910 00:06:28.910 Running for 1 seconds... 00:06:28.910 00:06:28.910 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:28.910 ------------------------------------------------------------------------------------ 00:06:28.910 0,0 639680/s 2498 MiB/s 0 0 00:06:28.910 ==================================================================================== 00:06:28.910 Total 639680/s 2498 MiB/s 0 0' 00:06:28.910 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.910 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.910 23:53:18 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:28.910 23:53:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:28.910 23:53:18 -- accel/accel.sh@12 -- # build_accel_config 00:06:28.910 23:53:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:28.910 23:53:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:28.910 23:53:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:28.910 23:53:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:28.910 23:53:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:28.910 23:53:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:28.910 23:53:18 -- accel/accel.sh@42 -- # jq -r . 00:06:28.910 [2024-04-25 23:53:18.217310] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
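dualcast, whose results appear above, writes one source to two destinations per operation, and the -y verify pass then has to check both outputs; its 2498 MiB/s landing between copy (2187) and fill (3788) is consistent with that extra write. A minimal sketch (illustrative, not SPDK's module code):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Conceptual dualcast: one 4096-byte source, two destinations. */
static void dualcast(void *dst1, void *dst2, const void *src, size_t len)
{
    memcpy(dst1, src, len);
    memcpy(dst2, src, len);
}

int main(void)
{
    static uint8_t src[4096], dst1[4096], dst2[4096];
    memset(src, 0x5A, sizeof(src));
    dualcast(dst1, dst2, src, sizeof(src));
    /* The -y verify step must compare both outputs against the source. */
    assert(memcmp(dst1, src, sizeof(src)) == 0);
    assert(memcmp(dst2, src, sizeof(src)) == 0);
    return 0;
}
```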
00:06:28.910 [2024-04-25 23:53:18.217408] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid463427 ] 00:06:28.910 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.910 [2024-04-25 23:53:18.286935] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.910 [2024-04-25 23:53:18.320744] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.910 23:53:18 -- accel/accel.sh@21 -- # val= 00:06:28.910 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.910 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.910 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.910 23:53:18 -- accel/accel.sh@21 -- # val= 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val=0x1 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val= 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val= 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val=dualcast 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val= 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val=software 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@23 -- # accel_module=software 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val=32 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val=32 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val=1 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val=Yes 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val= 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:28.911 23:53:18 -- accel/accel.sh@21 -- # val= 00:06:28.911 23:53:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # IFS=: 00:06:28.911 23:53:18 -- accel/accel.sh@20 -- # read -r var val 00:06:30.289 23:53:19 -- accel/accel.sh@21 -- # val= 00:06:30.289 23:53:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # IFS=: 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # read -r var val 00:06:30.289 23:53:19 -- accel/accel.sh@21 -- # val= 00:06:30.289 23:53:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # IFS=: 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # read -r var val 00:06:30.289 23:53:19 -- accel/accel.sh@21 -- # val= 00:06:30.289 23:53:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # IFS=: 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # read -r var val 00:06:30.289 23:53:19 -- accel/accel.sh@21 -- # val= 00:06:30.289 23:53:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # IFS=: 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # read -r var val 00:06:30.289 23:53:19 -- accel/accel.sh@21 -- # val= 00:06:30.289 23:53:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # IFS=: 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # read -r var val 00:06:30.289 23:53:19 -- accel/accel.sh@21 -- # val= 00:06:30.289 23:53:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # IFS=: 00:06:30.289 23:53:19 -- accel/accel.sh@20 -- # read -r var val 00:06:30.289 23:53:19 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:30.289 23:53:19 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:30.289 23:53:19 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:30.289 00:06:30.289 real 0m2.581s 00:06:30.289 user 0m2.325s 00:06:30.289 sys 0m0.264s 00:06:30.289 23:53:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.289 23:53:19 -- common/autotest_common.sh@10 -- # set +x 00:06:30.289 ************************************ 00:06:30.289 END TEST accel_dualcast 00:06:30.289 ************************************ 00:06:30.289 23:53:19 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:30.289 23:53:19 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:30.289 23:53:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:30.289 23:53:19 -- common/autotest_common.sh@10 -- # set +x 00:06:30.289 ************************************ 00:06:30.289 START TEST accel_compare 00:06:30.289 ************************************ 00:06:30.289 23:53:19 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:06:30.289 23:53:19 -- accel/accel.sh@16 -- # local accel_opc 00:06:30.289 23:53:19 -- 
accel/accel.sh@17 -- # local accel_module 00:06:30.289 23:53:19 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:30.289 23:53:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:30.289 23:53:19 -- accel/accel.sh@12 -- # build_accel_config 00:06:30.289 23:53:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:30.289 23:53:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:30.289 23:53:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:30.289 23:53:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:30.289 23:53:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:30.289 23:53:19 -- accel/accel.sh@41 -- # local IFS=, 00:06:30.289 23:53:19 -- accel/accel.sh@42 -- # jq -r . 00:06:30.289 [2024-04-25 23:53:19.557731] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:30.289 [2024-04-25 23:53:19.557811] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid463600 ] 00:06:30.289 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.289 [2024-04-25 23:53:19.626214] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.289 [2024-04-25 23:53:19.661470] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.225 23:53:20 -- accel/accel.sh@18 -- # out=' 00:06:31.225 SPDK Configuration: 00:06:31.225 Core mask: 0x1 00:06:31.225 00:06:31.225 Accel Perf Configuration: 00:06:31.225 Workload Type: compare 00:06:31.225 Transfer size: 4096 bytes 00:06:31.225 Vector count 1 00:06:31.225 Module: software 00:06:31.225 Queue depth: 32 00:06:31.225 Allocate depth: 32 00:06:31.225 # threads/core: 1 00:06:31.225 Run time: 1 seconds 00:06:31.225 Verify: Yes 00:06:31.225 00:06:31.225 Running for 1 seconds... 00:06:31.225 00:06:31.225 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:31.225 ------------------------------------------------------------------------------------ 00:06:31.225 0,0 793952/s 3101 MiB/s 0 0 00:06:31.225 ==================================================================================== 00:06:31.225 Total 793952/s 3101 MiB/s 0 0' 00:06:31.225 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.225 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.225 23:53:20 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:31.225 23:53:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:31.225 23:53:20 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.225 23:53:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.225 23:53:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.225 23:53:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.225 23:53:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.225 23:53:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.225 23:53:20 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.225 23:53:20 -- accel/accel.sh@42 -- # jq -r . 00:06:31.484 [2024-04-25 23:53:20.845130] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
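The compare workload above reads two buffers and flags any mismatch; the Failed and Miscompares columns count exactly those events, and the 0/0 result means every pair matched. Conceptually it is a memcmp per 4096-byte pair; a sketch (illustrative):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Conceptual compare op: returns 1 on a miscompare, 0 on a match. */
static int compare_op(const void *a, const void *b, size_t len)
{
    return memcmp(a, b, len) != 0;
}

int main(void)
{
    static uint8_t a[4096], b[4096];        /* zero-initialized, identical */
    unsigned miscompares = 0;
    miscompares += compare_op(a, b, sizeof(a)); /* match: adds 0 */
    b[100] = 1;
    miscompares += compare_op(a, b, sizeof(a)); /* mismatch: adds 1 */
    printf("miscompares=%u\n", miscompares);    /* prints 1 */
    return 0;
}
```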
00:06:31.485 [2024-04-25 23:53:20.845218] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid463845 ] 00:06:31.485 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.485 [2024-04-25 23:53:20.914131] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.485 [2024-04-25 23:53:20.948297] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val= 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val= 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val=0x1 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val= 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val= 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val=compare 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val= 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val=software 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@23 -- # accel_module=software 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val=32 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val=32 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- accel/accel.sh@21 -- # val=1 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:20 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:20 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:31.485 23:53:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:21 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:21 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:21 -- accel/accel.sh@21 -- # val=Yes 00:06:31.485 23:53:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:21 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:21 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:21 -- accel/accel.sh@21 -- # val= 00:06:31.485 23:53:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:21 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:21 -- accel/accel.sh@20 -- # read -r var val 00:06:31.485 23:53:21 -- accel/accel.sh@21 -- # val= 00:06:31.485 23:53:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.485 23:53:21 -- accel/accel.sh@20 -- # IFS=: 00:06:31.485 23:53:21 -- accel/accel.sh@20 -- # read -r var val 00:06:32.863 23:53:22 -- accel/accel.sh@21 -- # val= 00:06:32.863 23:53:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # IFS=: 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # read -r var val 00:06:32.863 23:53:22 -- accel/accel.sh@21 -- # val= 00:06:32.863 23:53:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # IFS=: 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # read -r var val 00:06:32.863 23:53:22 -- accel/accel.sh@21 -- # val= 00:06:32.863 23:53:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # IFS=: 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # read -r var val 00:06:32.863 23:53:22 -- accel/accel.sh@21 -- # val= 00:06:32.863 23:53:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # IFS=: 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # read -r var val 00:06:32.863 23:53:22 -- accel/accel.sh@21 -- # val= 00:06:32.863 23:53:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # IFS=: 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # read -r var val 00:06:32.863 23:53:22 -- accel/accel.sh@21 -- # val= 00:06:32.863 23:53:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # IFS=: 00:06:32.863 23:53:22 -- accel/accel.sh@20 -- # read -r var val 00:06:32.863 23:53:22 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:32.863 23:53:22 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:32.863 23:53:22 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:32.863 00:06:32.863 real 0m2.579s 00:06:32.863 user 0m2.333s 00:06:32.863 sys 0m0.256s 00:06:32.863 23:53:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.863 23:53:22 -- common/autotest_common.sh@10 -- # set +x 00:06:32.863 ************************************ 00:06:32.863 END TEST accel_compare 00:06:32.863 ************************************ 00:06:32.863 23:53:22 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:32.863 23:53:22 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:32.863 23:53:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:32.863 23:53:22 -- common/autotest_common.sh@10 -- # set +x 00:06:32.863 ************************************ 00:06:32.863 START TEST accel_xor 00:06:32.863 ************************************ 00:06:32.863 23:53:22 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:06:32.863 23:53:22 -- accel/accel.sh@16 -- # local accel_opc 00:06:32.863 23:53:22 -- accel/accel.sh@17 
-- # local accel_module 00:06:32.863 23:53:22 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:32.863 23:53:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:32.863 23:53:22 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.863 23:53:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.863 23:53:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.863 23:53:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.863 23:53:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.863 23:53:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.863 23:53:22 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.863 23:53:22 -- accel/accel.sh@42 -- # jq -r . 00:06:32.863 [2024-04-25 23:53:22.185322] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:32.863 [2024-04-25 23:53:22.185443] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid464132 ] 00:06:32.863 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.863 [2024-04-25 23:53:22.256549] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.863 [2024-04-25 23:53:22.293215] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.239 23:53:23 -- accel/accel.sh@18 -- # out=' 00:06:34.239 SPDK Configuration: 00:06:34.239 Core mask: 0x1 00:06:34.239 00:06:34.239 Accel Perf Configuration: 00:06:34.239 Workload Type: xor 00:06:34.239 Source buffers: 2 00:06:34.239 Transfer size: 4096 bytes 00:06:34.239 Vector count 1 00:06:34.239 Module: software 00:06:34.239 Queue depth: 32 00:06:34.239 Allocate depth: 32 00:06:34.239 # threads/core: 1 00:06:34.239 Run time: 1 seconds 00:06:34.239 Verify: Yes 00:06:34.239 00:06:34.239 Running for 1 seconds... 00:06:34.239 00:06:34.239 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:34.239 ------------------------------------------------------------------------------------ 00:06:34.239 0,0 682976/s 2667 MiB/s 0 0 00:06:34.239 ==================================================================================== 00:06:34.239 Total 682976/s 2667 MiB/s 0 0' 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:34.239 23:53:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:34.239 23:53:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:34.239 23:53:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:34.239 23:53:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.239 23:53:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.239 23:53:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:34.239 23:53:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:34.239 23:53:23 -- accel/accel.sh@41 -- # local IFS=, 00:06:34.239 23:53:23 -- accel/accel.sh@42 -- # jq -r . 00:06:34.239 [2024-04-25 23:53:23.475560] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
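The MiB/s column in the table above lines up with the other two columns: transfers per second times the 4096-byte transfer size, scaled to MiB. A quick shell check of the first xor run, with the numbers copied from the table (awk is only used here for the floating-point division):

  # 682976 transfers/s at 4096 bytes per transfer, from the table above
  awk 'BEGIN { printf "%.2f MiB/s\n", 682976 * 4096 / (1024 * 1024) }'
  # prints 2667.88, which accel_perf truncates to the 2667 MiB/s shown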
00:06:34.239 [2024-04-25 23:53:23.475650] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid464400 ] 00:06:34.239 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.239 [2024-04-25 23:53:23.547231] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.239 [2024-04-25 23:53:23.581122] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val= 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val= 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val=0x1 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val= 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val= 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val=xor 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val=2 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val= 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val=software 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@23 -- # accel_module=software 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val=32 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val=32 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- 
accel/accel.sh@21 -- # val=1 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val=Yes 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val= 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:34.239 23:53:23 -- accel/accel.sh@21 -- # val= 00:06:34.239 23:53:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # IFS=: 00:06:34.239 23:53:23 -- accel/accel.sh@20 -- # read -r var val 00:06:35.174 23:53:24 -- accel/accel.sh@21 -- # val= 00:06:35.174 23:53:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # IFS=: 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # read -r var val 00:06:35.174 23:53:24 -- accel/accel.sh@21 -- # val= 00:06:35.174 23:53:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # IFS=: 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # read -r var val 00:06:35.174 23:53:24 -- accel/accel.sh@21 -- # val= 00:06:35.174 23:53:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # IFS=: 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # read -r var val 00:06:35.174 23:53:24 -- accel/accel.sh@21 -- # val= 00:06:35.174 23:53:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # IFS=: 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # read -r var val 00:06:35.174 23:53:24 -- accel/accel.sh@21 -- # val= 00:06:35.174 23:53:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # IFS=: 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # read -r var val 00:06:35.174 23:53:24 -- accel/accel.sh@21 -- # val= 00:06:35.174 23:53:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # IFS=: 00:06:35.174 23:53:24 -- accel/accel.sh@20 -- # read -r var val 00:06:35.174 23:53:24 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:35.174 23:53:24 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:35.174 23:53:24 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:35.174 00:06:35.174 real 0m2.585s 00:06:35.174 user 0m2.326s 00:06:35.174 sys 0m0.267s 00:06:35.174 23:53:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.174 23:53:24 -- common/autotest_common.sh@10 -- # set +x 00:06:35.174 ************************************ 00:06:35.174 END TEST accel_xor 00:06:35.174 ************************************ 00:06:35.433 23:53:24 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:35.433 23:53:24 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:35.433 23:53:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:35.433 23:53:24 -- common/autotest_common.sh@10 -- # set +x 00:06:35.433 ************************************ 00:06:35.433 START TEST accel_xor 
00:06:35.433 ************************************ 00:06:35.433 23:53:24 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:06:35.433 23:53:24 -- accel/accel.sh@16 -- # local accel_opc 00:06:35.433 23:53:24 -- accel/accel.sh@17 -- # local accel_module 00:06:35.433 23:53:24 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:35.433 23:53:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:35.433 23:53:24 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.433 23:53:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.433 23:53:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.433 23:53:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.433 23:53:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.433 23:53:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.433 23:53:24 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.433 23:53:24 -- accel/accel.sh@42 -- # jq -r . 00:06:35.433 [2024-04-25 23:53:24.816747] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:35.433 [2024-04-25 23:53:24.816845] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid464686 ] 00:06:35.433 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.433 [2024-04-25 23:53:24.885826] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.433 [2024-04-25 23:53:24.920885] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.808 23:53:26 -- accel/accel.sh@18 -- # out=' 00:06:36.808 SPDK Configuration: 00:06:36.808 Core mask: 0x1 00:06:36.808 00:06:36.808 Accel Perf Configuration: 00:06:36.808 Workload Type: xor 00:06:36.808 Source buffers: 3 00:06:36.808 Transfer size: 4096 bytes 00:06:36.808 Vector count 1 00:06:36.808 Module: software 00:06:36.808 Queue depth: 32 00:06:36.808 Allocate depth: 32 00:06:36.808 # threads/core: 1 00:06:36.808 Run time: 1 seconds 00:06:36.808 Verify: Yes 00:06:36.808 00:06:36.808 Running for 1 seconds... 00:06:36.808 00:06:36.808 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:36.808 ------------------------------------------------------------------------------------ 00:06:36.808 0,0 648736/s 2534 MiB/s 0 0 00:06:36.808 ==================================================================================== 00:06:36.808 Total 648736/s 2534 MiB/s 0 0' 00:06:36.808 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.808 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.808 23:53:26 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:36.808 23:53:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:36.808 23:53:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.808 23:53:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.808 23:53:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.808 23:53:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.808 23:53:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.808 23:53:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.808 23:53:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.808 23:53:26 -- accel/accel.sh@42 -- # jq -r . 00:06:36.808 [2024-04-25 23:53:26.102790] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
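The second accel_xor pass adds -x 3, raising the source buffer count from 2 to 3 ("Source buffers: 3" in the configuration dump above); the destination is the byte-wise XOR of all source buffers. A small sketch of that semantics on single byte values, plus the measured cost of the extra source taken from the two 1-second software runs above (single short runs, so treat the delta as indicative rather than definitive):

  # XOR across two vs three sources, shown on single bytes
  a=0xAA; b=0x55; c=0x0F
  printf 'two sources:   0x%02X\n' $(( a ^ b ))       # 0xFF
  printf 'three sources: 0x%02X\n' $(( a ^ b ^ c ))   # 0xF0

  # throughput delta between the 2-source (682976/s) and 3-source (648736/s) runs
  awk 'BEGIN { printf "%.1f%% fewer transfers/s with a third source\n",
               (1 - 648736 / 682976) * 100 }'         # ~5.0%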
00:06:36.808 [2024-04-25 23:53:26.102879] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid464905 ] 00:06:36.808 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.808 [2024-04-25 23:53:26.174875] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.808 [2024-04-25 23:53:26.209631] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.808 23:53:26 -- accel/accel.sh@21 -- # val= 00:06:36.808 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.808 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.808 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.808 23:53:26 -- accel/accel.sh@21 -- # val= 00:06:36.808 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.808 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.808 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val=0x1 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val= 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val= 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val=xor 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val=3 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val= 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val=software 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@23 -- # accel_module=software 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val=32 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val=32 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- 
accel/accel.sh@21 -- # val=1 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val=Yes 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val= 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:36.809 23:53:26 -- accel/accel.sh@21 -- # val= 00:06:36.809 23:53:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # IFS=: 00:06:36.809 23:53:26 -- accel/accel.sh@20 -- # read -r var val 00:06:38.184 23:53:27 -- accel/accel.sh@21 -- # val= 00:06:38.184 23:53:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # IFS=: 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # read -r var val 00:06:38.184 23:53:27 -- accel/accel.sh@21 -- # val= 00:06:38.184 23:53:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # IFS=: 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # read -r var val 00:06:38.184 23:53:27 -- accel/accel.sh@21 -- # val= 00:06:38.184 23:53:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # IFS=: 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # read -r var val 00:06:38.184 23:53:27 -- accel/accel.sh@21 -- # val= 00:06:38.184 23:53:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # IFS=: 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # read -r var val 00:06:38.184 23:53:27 -- accel/accel.sh@21 -- # val= 00:06:38.184 23:53:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # IFS=: 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # read -r var val 00:06:38.184 23:53:27 -- accel/accel.sh@21 -- # val= 00:06:38.184 23:53:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # IFS=: 00:06:38.184 23:53:27 -- accel/accel.sh@20 -- # read -r var val 00:06:38.184 23:53:27 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:38.184 23:53:27 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:38.184 23:53:27 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:38.184 00:06:38.184 real 0m2.580s 00:06:38.184 user 0m2.329s 00:06:38.184 sys 0m0.258s 00:06:38.184 23:53:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:38.184 23:53:27 -- common/autotest_common.sh@10 -- # set +x 00:06:38.184 ************************************ 00:06:38.184 END TEST accel_xor 00:06:38.184 ************************************ 00:06:38.184 23:53:27 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:38.184 23:53:27 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:38.184 23:53:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:38.184 23:53:27 -- common/autotest_common.sh@10 -- # set +x 00:06:38.184 ************************************ 00:06:38.184 START TEST 
accel_dif_verify 00:06:38.184 ************************************ 00:06:38.184 23:53:27 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:06:38.184 23:53:27 -- accel/accel.sh@16 -- # local accel_opc 00:06:38.184 23:53:27 -- accel/accel.sh@17 -- # local accel_module 00:06:38.184 23:53:27 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:38.184 23:53:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:38.184 23:53:27 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.184 23:53:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.184 23:53:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.184 23:53:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.184 23:53:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.184 23:53:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.184 23:53:27 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.184 23:53:27 -- accel/accel.sh@42 -- # jq -r . 00:06:38.184 [2024-04-25 23:53:27.445333] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:38.185 [2024-04-25 23:53:27.445422] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid465101 ] 00:06:38.185 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.185 [2024-04-25 23:53:27.514234] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.185 [2024-04-25 23:53:27.548918] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.145 23:53:28 -- accel/accel.sh@18 -- # out=' 00:06:39.145 SPDK Configuration: 00:06:39.145 Core mask: 0x1 00:06:39.145 00:06:39.145 Accel Perf Configuration: 00:06:39.145 Workload Type: dif_verify 00:06:39.145 Vector size: 4096 bytes 00:06:39.145 Transfer size: 4096 bytes 00:06:39.145 Block size: 512 bytes 00:06:39.145 Metadata size: 8 bytes 00:06:39.145 Vector count 1 00:06:39.145 Module: software 00:06:39.145 Queue depth: 32 00:06:39.145 Allocate depth: 32 00:06:39.145 # threads/core: 1 00:06:39.145 Run time: 1 seconds 00:06:39.145 Verify: No 00:06:39.145 00:06:39.145 Running for 1 seconds... 00:06:39.145 00:06:39.145 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:39.145 ------------------------------------------------------------------------------------ 00:06:39.145 0,0 245248/s 972 MiB/s 0 0 00:06:39.145 ==================================================================================== 00:06:39.145 Total 245248/s 958 MiB/s 0 0' 00:06:39.145 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.145 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.145 23:53:28 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:39.145 23:53:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:39.145 23:53:28 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.145 23:53:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.145 23:53:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.145 23:53:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.145 23:53:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.145 23:53:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.145 23:53:28 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.145 23:53:28 -- accel/accel.sh@42 -- # jq -r . 
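Per the dif_verify configuration above, each 4096-byte transfer is treated as eight 512-byte blocks, each carrying 8 bytes of DIF protection metadata, so 64 metadata bytes accompany every transfer. That also offers one reading of why the per-core row shows 972 MiB/s while the Total row shows 958 MiB/s for the same 245248 transfers/s: the per-core figure appears to count data plus metadata (4160 bytes per transfer) and the total only the 4096 data bytes. A shell check of that interpretation; note it is inferred from the numbers, not documented accel_perf behaviour:

  # DIF geometry from the configuration above
  blocks=$(( 4096 / 512 ))   # 8 blocks per 4096-byte transfer
  meta=$(( blocks * 8 ))     # 64 bytes of DIF metadata per transfer
  awk -v m="$meta" 'BEGIN {
      printf "data+metadata: %.2f MiB/s\n", 245248 * (4096 + m) / 1048576  # 972.97 -> reported as 972
      printf "data only:     %.2f MiB/s\n", 245248 * 4096       / 1048576  # 958.00 -> reported as 958
  }'

The same 4160-versus-4096-byte accounting reproduces the dif_generate and dif_generate_copy tables further down.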
00:06:39.145 [2024-04-25 23:53:28.729999] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:39.145 [2024-04-25 23:53:28.730092] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid465264 ] 00:06:39.404 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.404 [2024-04-25 23:53:28.801870] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.404 [2024-04-25 23:53:28.835904] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val= 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val= 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val=0x1 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val= 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val= 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val=dif_verify 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val= 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val=software 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@23 -- # 
accel_module=software 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val=32 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val=32 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val=1 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val=No 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val= 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:39.404 23:53:28 -- accel/accel.sh@21 -- # val= 00:06:39.404 23:53:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # IFS=: 00:06:39.404 23:53:28 -- accel/accel.sh@20 -- # read -r var val 00:06:40.780 23:53:29 -- accel/accel.sh@21 -- # val= 00:06:40.780 23:53:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # IFS=: 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # read -r var val 00:06:40.780 23:53:29 -- accel/accel.sh@21 -- # val= 00:06:40.780 23:53:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # IFS=: 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # read -r var val 00:06:40.780 23:53:29 -- accel/accel.sh@21 -- # val= 00:06:40.780 23:53:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # IFS=: 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # read -r var val 00:06:40.780 23:53:29 -- accel/accel.sh@21 -- # val= 00:06:40.780 23:53:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # IFS=: 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # read -r var val 00:06:40.780 23:53:29 -- accel/accel.sh@21 -- # val= 00:06:40.780 23:53:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # IFS=: 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # read -r var val 00:06:40.780 23:53:29 -- accel/accel.sh@21 -- # val= 00:06:40.780 23:53:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.780 23:53:29 -- accel/accel.sh@20 -- # IFS=: 00:06:40.780 23:53:30 -- accel/accel.sh@20 -- # read -r var val 00:06:40.780 23:53:30 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:40.780 23:53:30 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:06:40.780 23:53:30 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:40.780 00:06:40.780 real 0m2.577s 00:06:40.780 user 0m2.334s 00:06:40.780 sys 0m0.254s 00:06:40.780 23:53:30 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.780 23:53:30 -- common/autotest_common.sh@10 -- # set +x 00:06:40.780 ************************************ 00:06:40.780 END TEST accel_dif_verify 00:06:40.780 ************************************ 00:06:40.780 23:53:30 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:40.780 23:53:30 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:40.780 23:53:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:40.780 23:53:30 -- common/autotest_common.sh@10 -- # set +x 00:06:40.780 ************************************ 00:06:40.780 START TEST accel_dif_generate 00:06:40.780 ************************************ 00:06:40.780 23:53:30 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:06:40.780 23:53:30 -- accel/accel.sh@16 -- # local accel_opc 00:06:40.780 23:53:30 -- accel/accel.sh@17 -- # local accel_module 00:06:40.780 23:53:30 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:06:40.780 23:53:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:40.780 23:53:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.780 23:53:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.780 23:53:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.780 23:53:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.780 23:53:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.780 23:53:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.780 23:53:30 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.780 23:53:30 -- accel/accel.sh@42 -- # jq -r . 00:06:40.780 [2024-04-25 23:53:30.067791] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:40.780 [2024-04-25 23:53:30.067883] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid465547 ] 00:06:40.780 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.780 [2024-04-25 23:53:30.138692] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.780 [2024-04-25 23:53:30.174129] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.157 23:53:31 -- accel/accel.sh@18 -- # out=' 00:06:42.157 SPDK Configuration: 00:06:42.157 Core mask: 0x1 00:06:42.157 00:06:42.157 Accel Perf Configuration: 00:06:42.157 Workload Type: dif_generate 00:06:42.157 Vector size: 4096 bytes 00:06:42.157 Transfer size: 4096 bytes 00:06:42.157 Block size: 512 bytes 00:06:42.157 Metadata size: 8 bytes 00:06:42.157 Vector count 1 00:06:42.157 Module: software 00:06:42.157 Queue depth: 32 00:06:42.157 Allocate depth: 32 00:06:42.157 # threads/core: 1 00:06:42.157 Run time: 1 seconds 00:06:42.157 Verify: No 00:06:42.157 00:06:42.157 Running for 1 seconds... 
00:06:42.157 00:06:42.157 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:42.157 ------------------------------------------------------------------------------------ 00:06:42.157 0,0 281376/s 1116 MiB/s 0 0 00:06:42.157 ==================================================================================== 00:06:42.157 Total 281376/s 1099 MiB/s 0 0' 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.157 23:53:31 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:42.157 23:53:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:42.157 23:53:31 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.157 23:53:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.157 23:53:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.157 23:53:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.157 23:53:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.157 23:53:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.157 23:53:31 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.157 23:53:31 -- accel/accel.sh@42 -- # jq -r . 00:06:42.157 [2024-04-25 23:53:31.353336] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:42.157 [2024-04-25 23:53:31.353439] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid465819 ] 00:06:42.157 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.157 [2024-04-25 23:53:31.420976] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.157 [2024-04-25 23:53:31.454667] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.157 23:53:31 -- accel/accel.sh@21 -- # val= 00:06:42.157 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.157 23:53:31 -- accel/accel.sh@21 -- # val= 00:06:42.157 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.157 23:53:31 -- accel/accel.sh@21 -- # val=0x1 00:06:42.157 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.157 23:53:31 -- accel/accel.sh@21 -- # val= 00:06:42.157 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.157 23:53:31 -- accel/accel.sh@21 -- # val= 00:06:42.157 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.157 23:53:31 -- accel/accel.sh@21 -- # val=dif_generate 00:06:42.157 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.157 23:53:31 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.157 23:53:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:42.157 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.157 23:53:31 -- accel/accel.sh@20 -- # IFS=: 
00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val= 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val=software 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@23 -- # accel_module=software 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val=32 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val=32 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val=1 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val=No 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val= 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:42.158 23:53:31 -- accel/accel.sh@21 -- # val= 00:06:42.158 23:53:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # IFS=: 00:06:42.158 23:53:31 -- accel/accel.sh@20 -- # read -r var val 00:06:43.095 23:53:32 -- accel/accel.sh@21 -- # val= 00:06:43.095 23:53:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # IFS=: 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # read -r var val 00:06:43.095 23:53:32 -- accel/accel.sh@21 -- # val= 00:06:43.095 23:53:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # IFS=: 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # read -r var val 00:06:43.095 23:53:32 -- accel/accel.sh@21 -- # val= 00:06:43.095 23:53:32 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # IFS=: 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # read -r var val 00:06:43.095 23:53:32 -- accel/accel.sh@21 -- # val= 00:06:43.095 23:53:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # IFS=: 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # read -r var val 00:06:43.095 23:53:32 -- accel/accel.sh@21 -- # val= 00:06:43.095 23:53:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # IFS=: 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # read -r var val 00:06:43.095 23:53:32 -- accel/accel.sh@21 -- # val= 00:06:43.095 23:53:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # IFS=: 00:06:43.095 23:53:32 -- accel/accel.sh@20 -- # read -r var val 00:06:43.095 23:53:32 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:43.095 23:53:32 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:06:43.095 23:53:32 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:43.095 00:06:43.095 real 0m2.572s 00:06:43.095 user 0m2.317s 00:06:43.095 sys 0m0.265s 00:06:43.095 23:53:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:43.095 23:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:43.095 ************************************ 00:06:43.095 END TEST accel_dif_generate 00:06:43.095 ************************************ 00:06:43.095 23:53:32 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:43.095 23:53:32 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:43.095 23:53:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:43.095 23:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:43.095 ************************************ 00:06:43.095 START TEST accel_dif_generate_copy 00:06:43.095 ************************************ 00:06:43.095 23:53:32 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:06:43.095 23:53:32 -- accel/accel.sh@16 -- # local accel_opc 00:06:43.095 23:53:32 -- accel/accel.sh@17 -- # local accel_module 00:06:43.095 23:53:32 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:06:43.095 23:53:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:43.095 23:53:32 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.095 23:53:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.095 23:53:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.095 23:53:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.095 23:53:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.095 23:53:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.095 23:53:32 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.095 23:53:32 -- accel/accel.sh@42 -- # jq -r . 00:06:43.095 [2024-04-25 23:53:32.687802] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
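Comparing the run_test command lines in this section: the xor workloads are launched with -y and report "Verify: Yes", while the dif workloads omit it and report "Verify: No", so -y appears to toggle accel_perf's own verification of the results. A minimal sketch of driving the dif workload family directly, assuming the workspace path used throughout this job (adjust SPDK_DIR for a local tree; the harness additionally feeds a JSON accel config over -c /dev/fd/62, left out here on the assumption that accel_perf falls back to the software module without it):

  SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  for w in dif_verify dif_generate dif_generate_copy; do
      # one 1-second run per workload, matching the tests in this section
      "$SPDK_DIR/build/examples/accel_perf" -t 1 -w "$w"
  done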
00:06:43.095 [2024-04-25 23:53:32.687898] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid466102 ] 00:06:43.353 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.353 [2024-04-25 23:53:32.756527] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.353 [2024-04-25 23:53:32.791681] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.728 23:53:33 -- accel/accel.sh@18 -- # out=' 00:06:44.728 SPDK Configuration: 00:06:44.728 Core mask: 0x1 00:06:44.728 00:06:44.728 Accel Perf Configuration: 00:06:44.728 Workload Type: dif_generate_copy 00:06:44.728 Vector size: 4096 bytes 00:06:44.728 Transfer size: 4096 bytes 00:06:44.728 Vector count 1 00:06:44.728 Module: software 00:06:44.728 Queue depth: 32 00:06:44.728 Allocate depth: 32 00:06:44.728 # threads/core: 1 00:06:44.728 Run time: 1 seconds 00:06:44.728 Verify: No 00:06:44.728 00:06:44.728 Running for 1 seconds... 00:06:44.728 00:06:44.728 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:44.728 ------------------------------------------------------------------------------------ 00:06:44.728 0,0 221440/s 878 MiB/s 0 0 00:06:44.728 ==================================================================================== 00:06:44.728 Total 221440/s 865 MiB/s 0 0' 00:06:44.728 23:53:33 -- accel/accel.sh@20 -- # IFS=: 00:06:44.728 23:53:33 -- accel/accel.sh@20 -- # read -r var val 00:06:44.728 23:53:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:44.728 23:53:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:44.728 23:53:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.728 23:53:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:44.728 23:53:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.728 23:53:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.728 23:53:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:44.728 23:53:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:44.728 23:53:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:44.728 23:53:33 -- accel/accel.sh@42 -- # jq -r . 00:06:44.728 [2024-04-25 23:53:33.970350] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
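The metadata accounting suggested after the dif_verify table holds for the two generate workloads as well: dif_generate reports 281376 transfers/s (1116 vs 1099 MiB/s) and dif_generate_copy 221440 transfers/s (878 vs 865 MiB/s) in the tables above. Checking both pairs under the same data-plus-64-bytes-of-DIF assumption:

  for rate in 281376 221440; do
      # per-core row (4160 bytes incl. metadata) vs Total row (4096 data bytes)
      awk -v r="$rate" 'BEGIN {
          printf "%d transfers/s: %.1f vs %.1f MiB/s\n",
                 r, r * 4160 / 1048576, r * 4096 / 1048576 }'
  done
  # 281376 transfers/s: 1116.3 vs 1099.1 MiB/s   (dif_generate)
  # 221440 transfers/s: 878.5 vs 865.0 MiB/s     (dif_generate_copy)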
00:06:44.728 [2024-04-25 23:53:33.970448] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid466373 ] 00:06:44.728 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.728 [2024-04-25 23:53:34.039375] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.728 [2024-04-25 23:53:34.073250] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.728 23:53:34 -- accel/accel.sh@21 -- # val= 00:06:44.728 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.728 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.728 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.728 23:53:34 -- accel/accel.sh@21 -- # val= 00:06:44.728 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.728 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.728 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.728 23:53:34 -- accel/accel.sh@21 -- # val=0x1 00:06:44.728 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.728 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.728 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.728 23:53:34 -- accel/accel.sh@21 -- # val= 00:06:44.728 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.728 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.728 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.728 23:53:34 -- accel/accel.sh@21 -- # val= 00:06:44.728 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.728 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val= 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val=software 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@23 -- # accel_module=software 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val=32 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val=32 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var 
val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val=1 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val=No 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val= 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:44.729 23:53:34 -- accel/accel.sh@21 -- # val= 00:06:44.729 23:53:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # IFS=: 00:06:44.729 23:53:34 -- accel/accel.sh@20 -- # read -r var val 00:06:45.667 23:53:35 -- accel/accel.sh@21 -- # val= 00:06:45.667 23:53:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # IFS=: 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # read -r var val 00:06:45.667 23:53:35 -- accel/accel.sh@21 -- # val= 00:06:45.667 23:53:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # IFS=: 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # read -r var val 00:06:45.667 23:53:35 -- accel/accel.sh@21 -- # val= 00:06:45.667 23:53:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # IFS=: 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # read -r var val 00:06:45.667 23:53:35 -- accel/accel.sh@21 -- # val= 00:06:45.667 23:53:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # IFS=: 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # read -r var val 00:06:45.667 23:53:35 -- accel/accel.sh@21 -- # val= 00:06:45.667 23:53:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # IFS=: 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # read -r var val 00:06:45.667 23:53:35 -- accel/accel.sh@21 -- # val= 00:06:45.667 23:53:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # IFS=: 00:06:45.667 23:53:35 -- accel/accel.sh@20 -- # read -r var val 00:06:45.667 23:53:35 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:45.667 23:53:35 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:06:45.667 23:53:35 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:45.667 00:06:45.667 real 0m2.571s 00:06:45.667 user 0m2.328s 00:06:45.667 sys 0m0.252s 00:06:45.667 23:53:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.667 23:53:35 -- common/autotest_common.sh@10 -- # set +x 00:06:45.667 ************************************ 00:06:45.667 END TEST accel_dif_generate_copy 00:06:45.667 ************************************ 00:06:45.927 23:53:35 -- accel/accel.sh@107 -- # [[ y == y ]] 00:06:45.927 23:53:35 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:45.927 23:53:35 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:45.927 23:53:35 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:06:45.927 23:53:35 -- common/autotest_common.sh@10 -- # set +x 00:06:45.927 ************************************ 00:06:45.927 START TEST accel_comp 00:06:45.927 ************************************ 00:06:45.927 23:53:35 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:45.927 23:53:35 -- accel/accel.sh@16 -- # local accel_opc 00:06:45.927 23:53:35 -- accel/accel.sh@17 -- # local accel_module 00:06:45.927 23:53:35 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:45.927 23:53:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:45.927 23:53:35 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.927 23:53:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.927 23:53:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.927 23:53:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.927 23:53:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.927 23:53:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.927 23:53:35 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.927 23:53:35 -- accel/accel.sh@42 -- # jq -r . 00:06:45.927 [2024-04-25 23:53:35.309438] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:45.927 [2024-04-25 23:53:35.309519] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid466605 ] 00:06:45.927 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.927 [2024-04-25 23:53:35.378773] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.927 [2024-04-25 23:53:35.414186] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.306 23:53:36 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:47.306 00:06:47.306 SPDK Configuration: 00:06:47.306 Core mask: 0x1 00:06:47.306 00:06:47.306 Accel Perf Configuration: 00:06:47.306 Workload Type: compress 00:06:47.306 Transfer size: 4096 bytes 00:06:47.306 Vector count 1 00:06:47.306 Module: software 00:06:47.306 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:47.306 Queue depth: 32 00:06:47.306 Allocate depth: 32 00:06:47.306 # threads/core: 1 00:06:47.306 Run time: 1 seconds 00:06:47.306 Verify: No 00:06:47.306 00:06:47.306 Running for 1 seconds... 
00:06:47.306 00:06:47.306 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:47.306 ------------------------------------------------------------------------------------ 00:06:47.306 0,0 68096/s 266 MiB/s 0 0 00:06:47.306 ==================================================================================== 00:06:47.306 Total 68096/s 266 MiB/s 0 0' 00:06:47.306 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.306 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.306 23:53:36 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:47.306 23:53:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:47.306 23:53:36 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.306 23:53:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.306 23:53:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.306 23:53:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.306 23:53:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.306 23:53:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.306 23:53:36 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.306 23:53:36 -- accel/accel.sh@42 -- # jq -r . 00:06:47.306 [2024-04-25 23:53:36.596792] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:47.306 [2024-04-25 23:53:36.596883] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid466754 ] 00:06:47.306 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.306 [2024-04-25 23:53:36.666964] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.306 [2024-04-25 23:53:36.700896] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.306 23:53:36 -- accel/accel.sh@21 -- # val= 00:06:47.306 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.306 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.306 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.306 23:53:36 -- accel/accel.sh@21 -- # val= 00:06:47.306 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.306 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.306 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.306 23:53:36 -- accel/accel.sh@21 -- # val= 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val=0x1 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val= 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val= 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val=compress 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307
23:53:36 -- accel/accel.sh@24 -- # accel_opc=compress 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val= 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val=software 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@23 -- # accel_module=software 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val=32 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val=32 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val=1 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val=No 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val= 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:47.307 23:53:36 -- accel/accel.sh@21 -- # val= 00:06:47.307 23:53:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # IFS=: 00:06:47.307 23:53:36 -- accel/accel.sh@20 -- # read -r var val 00:06:48.686 23:53:37 -- accel/accel.sh@21 -- # val= 00:06:48.686 23:53:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # IFS=: 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # read -r var val 00:06:48.686 23:53:37 -- accel/accel.sh@21 -- # val= 00:06:48.686 23:53:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # IFS=: 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # read -r var val 00:06:48.686 23:53:37 -- accel/accel.sh@21 -- # val= 00:06:48.686 23:53:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # 
IFS=: 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # read -r var val 00:06:48.686 23:53:37 -- accel/accel.sh@21 -- # val= 00:06:48.686 23:53:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # IFS=: 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # read -r var val 00:06:48.686 23:53:37 -- accel/accel.sh@21 -- # val= 00:06:48.686 23:53:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # IFS=: 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # read -r var val 00:06:48.686 23:53:37 -- accel/accel.sh@21 -- # val= 00:06:48.686 23:53:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # IFS=: 00:06:48.686 23:53:37 -- accel/accel.sh@20 -- # read -r var val 00:06:48.686 23:53:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:48.686 23:53:37 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:06:48.686 23:53:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:48.686 00:06:48.686 real 0m2.581s 00:06:48.686 user 0m2.332s 00:06:48.686 sys 0m0.258s 00:06:48.686 23:53:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:48.686 23:53:37 -- common/autotest_common.sh@10 -- # set +x 00:06:48.686 ************************************ 00:06:48.686 END TEST accel_comp 00:06:48.686 ************************************ 00:06:48.686 23:53:37 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:48.686 23:53:37 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:48.686 23:53:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:48.686 23:53:37 -- common/autotest_common.sh@10 -- # set +x 00:06:48.686 ************************************ 00:06:48.686 START TEST accel_decomp 00:06:48.686 ************************************ 00:06:48.686 23:53:37 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:48.686 23:53:37 -- accel/accel.sh@16 -- # local accel_opc 00:06:48.686 23:53:37 -- accel/accel.sh@17 -- # local accel_module 00:06:48.686 23:53:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:48.686 23:53:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:48.686 23:53:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.686 23:53:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.686 23:53:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.686 23:53:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.686 23:53:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.686 23:53:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.686 23:53:37 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.686 23:53:37 -- accel/accel.sh@42 -- # jq -r . 00:06:48.686 [2024-04-25 23:53:37.938334] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
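The accel_decomp invocation traced above matches accel_comp except for the workload (-w decompress) and the added -y flag, which turns on output verification; that is what flips the configuration below to 'Verify: Yes', where the compress runs report 'Verify: No' (inferred by comparing the two traces). A minimal standalone reproduction from an SPDK build directory, a sketch that assumes the harness-supplied '-c /dev/fd/62' JSON config can be omitted for a default software-module run:

  ./build/examples/accel_perf -t 1 -w decompress -l ./test/accel/bib -y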
00:06:48.686 [2024-04-25 23:53:37.938504] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid466963 ] 00:06:48.686 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.686 [2024-04-25 23:53:38.009729] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.686 [2024-04-25 23:53:38.045041] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.622 23:53:39 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:49.622 00:06:49.622 SPDK Configuration: 00:06:49.622 Core mask: 0x1 00:06:49.622 00:06:49.622 Accel Perf Configuration: 00:06:49.622 Workload Type: decompress 00:06:49.622 Transfer size: 4096 bytes 00:06:49.622 Vector count 1 00:06:49.622 Module: software 00:06:49.622 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:49.622 Queue depth: 32 00:06:49.622 Allocate depth: 32 00:06:49.622 # threads/core: 1 00:06:49.622 Run time: 1 seconds 00:06:49.622 Verify: Yes 00:06:49.622 00:06:49.622 Running for 1 seconds... 00:06:49.622 00:06:49.622 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:49.622 ------------------------------------------------------------------------------------ 00:06:49.622 0,0 93728/s 366 MiB/s 0 0 00:06:49.622 ==================================================================================== 00:06:49.622 Total 93728/s 366 MiB/s 0 0' 00:06:49.623 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.623 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.623 23:53:39 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:49.623 23:53:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:49.623 23:53:39 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.623 23:53:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.623 23:53:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.623 23:53:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.623 23:53:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.623 23:53:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.623 23:53:39 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.623 23:53:39 -- accel/accel.sh@42 -- # jq -r . 00:06:49.623 [2024-04-25 23:53:39.227161] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:06:49.623 [2024-04-25 23:53:39.227252] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid467231 ] 00:06:49.882 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.882 [2024-04-25 23:53:39.295906] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.882 [2024-04-25 23:53:39.330137] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val= 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val= 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val= 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val=0x1 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val= 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val= 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val=decompress 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val= 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val=software 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@23 -- # accel_module=software 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val=32 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 
-- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val=32 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val=1 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val=Yes 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val= 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:49.882 23:53:39 -- accel/accel.sh@21 -- # val= 00:06:49.882 23:53:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # IFS=: 00:06:49.882 23:53:39 -- accel/accel.sh@20 -- # read -r var val 00:06:51.259 23:53:40 -- accel/accel.sh@21 -- # val= 00:06:51.259 23:53:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # IFS=: 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # read -r var val 00:06:51.259 23:53:40 -- accel/accel.sh@21 -- # val= 00:06:51.259 23:53:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # IFS=: 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # read -r var val 00:06:51.259 23:53:40 -- accel/accel.sh@21 -- # val= 00:06:51.259 23:53:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # IFS=: 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # read -r var val 00:06:51.259 23:53:40 -- accel/accel.sh@21 -- # val= 00:06:51.259 23:53:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # IFS=: 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # read -r var val 00:06:51.259 23:53:40 -- accel/accel.sh@21 -- # val= 00:06:51.259 23:53:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # IFS=: 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # read -r var val 00:06:51.259 23:53:40 -- accel/accel.sh@21 -- # val= 00:06:51.259 23:53:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # IFS=: 00:06:51.259 23:53:40 -- accel/accel.sh@20 -- # read -r var val 00:06:51.259 23:53:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:51.259 23:53:40 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:51.259 23:53:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.259 00:06:51.259 real 0m2.584s 00:06:51.259 user 0m2.325s 00:06:51.259 sys 0m0.268s 00:06:51.259 23:53:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.259 23:53:40 -- common/autotest_common.sh@10 -- # set +x 00:06:51.259 ************************************ 00:06:51.259 END TEST accel_decomp 00:06:51.259 ************************************ 00:06:51.259 23:53:40 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 -w 
decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:51.259 23:53:40 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:51.259 23:53:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:51.259 23:53:40 -- common/autotest_common.sh@10 -- # set +x 00:06:51.259 ************************************ 00:06:51.259 START TEST accel_decmop_full 00:06:51.259 ************************************ 00:06:51.259 23:53:40 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:51.259 23:53:40 -- accel/accel.sh@16 -- # local accel_opc 00:06:51.259 23:53:40 -- accel/accel.sh@17 -- # local accel_module 00:06:51.259 23:53:40 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:51.259 23:53:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:51.259 23:53:40 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.259 23:53:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.259 23:53:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.259 23:53:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.259 23:53:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.259 23:53:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.259 23:53:40 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.259 23:53:40 -- accel/accel.sh@42 -- # jq -r . 00:06:51.259 [2024-04-25 23:53:40.569896] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:51.259 [2024-04-25 23:53:40.569987] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid467518 ] 00:06:51.259 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.259 [2024-04-25 23:53:40.641449] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.259 [2024-04-25 23:53:40.676831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.634 23:53:41 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:52.635 00:06:52.635 SPDK Configuration: 00:06:52.635 Core mask: 0x1 00:06:52.635 00:06:52.635 Accel Perf Configuration: 00:06:52.635 Workload Type: decompress 00:06:52.635 Transfer size: 111250 bytes 00:06:52.635 Vector count 1 00:06:52.635 Module: software 00:06:52.635 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:52.635 Queue depth: 32 00:06:52.635 Allocate depth: 32 00:06:52.635 # threads/core: 1 00:06:52.635 Run time: 1 seconds 00:06:52.635 Verify: Yes 00:06:52.635 00:06:52.635 Running for 1 seconds... 
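Unlike the plain accel_decomp run, this accel_decmop_full variant passes -o 0, and the configuration above then reports 'Transfer size: 111250 bytes' instead of the 4096-byte default, which suggests the size is taken from the compressed test file's chunks (an inference from this log, not confirmed by it). The bandwidth arithmetic is unchanged, only at the larger size:

  echo $(( 5824 * 111250 / 1048576 ))   # 5824 transfers/s x 111250 B = 617 MiB/s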
00:06:52.635 00:06:52.635 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:52.635 ------------------------------------------------------------------------------------ 00:06:52.635 0,0 5824/s 617 MiB/s 0 0 00:06:52.635 ==================================================================================== 00:06:52.635 Total 5824/s 617 MiB/s 0 0' 00:06:52.635 23:53:41 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:41 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:52.635 23:53:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:52.635 23:53:41 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.635 23:53:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.635 23:53:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.635 23:53:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.635 23:53:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.635 23:53:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.635 23:53:41 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.635 23:53:41 -- accel/accel.sh@42 -- # jq -r . 00:06:52.635 [2024-04-25 23:53:41.869871] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:52.635 [2024-04-25 23:53:41.869955] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid467790 ] 00:06:52.635 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.635 [2024-04-25 23:53:41.938413] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.635 [2024-04-25 23:53:41.972466] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val= 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val= 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val= 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val=0x1 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val= 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val= 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val=decompress 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case
"$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val= 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val=software 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@23 -- # accel_module=software 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val=32 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val=32 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val=1 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val=Yes 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val= 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:52.635 23:53:42 -- accel/accel.sh@21 -- # val= 00:06:52.635 23:53:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # IFS=: 00:06:52.635 23:53:42 -- accel/accel.sh@20 -- # read -r var val 00:06:53.572 23:53:43 -- accel/accel.sh@21 -- # val= 00:06:53.572 23:53:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # IFS=: 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # read -r var val 00:06:53.572 23:53:43 -- accel/accel.sh@21 -- # val= 00:06:53.572 23:53:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # IFS=: 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # read -r var val 00:06:53.572 23:53:43 -- accel/accel.sh@21 -- # val= 00:06:53.572 23:53:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.572 23:53:43 
-- accel/accel.sh@20 -- # IFS=: 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # read -r var val 00:06:53.572 23:53:43 -- accel/accel.sh@21 -- # val= 00:06:53.572 23:53:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # IFS=: 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # read -r var val 00:06:53.572 23:53:43 -- accel/accel.sh@21 -- # val= 00:06:53.572 23:53:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # IFS=: 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # read -r var val 00:06:53.572 23:53:43 -- accel/accel.sh@21 -- # val= 00:06:53.572 23:53:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # IFS=: 00:06:53.572 23:53:43 -- accel/accel.sh@20 -- # read -r var val 00:06:53.572 23:53:43 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:53.572 23:53:43 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:53.572 23:53:43 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.572 00:06:53.572 real 0m2.602s 00:06:53.572 user 0m2.358s 00:06:53.572 sys 0m0.253s 00:06:53.572 23:53:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.572 23:53:43 -- common/autotest_common.sh@10 -- # set +x 00:06:53.572 ************************************ 00:06:53.572 END TEST accel_decmop_full 00:06:53.572 ************************************ 00:06:53.831 23:53:43 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:53.831 23:53:43 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:53.831 23:53:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:53.831 23:53:43 -- common/autotest_common.sh@10 -- # set +x 00:06:53.831 ************************************ 00:06:53.831 START TEST accel_decomp_mcore 00:06:53.831 ************************************ 00:06:53.831 23:53:43 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:53.831 23:53:43 -- accel/accel.sh@16 -- # local accel_opc 00:06:53.831 23:53:43 -- accel/accel.sh@17 -- # local accel_module 00:06:53.831 23:53:43 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:53.831 23:53:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:53.831 23:53:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.831 23:53:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.831 23:53:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.831 23:53:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.831 23:53:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.831 23:53:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.831 23:53:43 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.831 23:53:43 -- accel/accel.sh@42 -- # jq -r . 00:06:53.831 [2024-04-25 23:53:43.219677] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
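accel_decomp_mcore reruns the same decompress workload with -m 0xf, one mask bit per core; that is why EAL reports 'Total cores available: 4' below, four reactors start, and the results table gains a row per core. A sketch of decoding the mask (assumes bc is installed):

  echo 'obase=2; ibase=16; F' | bc   # 0xf -> 1111 -> cores 0, 1, 2 and 3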
00:06:53.831 [2024-04-25 23:53:43.219772] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468072 ] 00:06:53.831 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.831 [2024-04-25 23:53:43.290979] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:53.831 [2024-04-25 23:53:43.329757] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:53.831 [2024-04-25 23:53:43.329852] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:53.831 [2024-04-25 23:53:43.329913] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:53.831 [2024-04-25 23:53:43.329915] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.207 23:53:44 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:55.207 00:06:55.207 SPDK Configuration: 00:06:55.207 Core mask: 0xf 00:06:55.207 00:06:55.207 Accel Perf Configuration: 00:06:55.207 Workload Type: decompress 00:06:55.207 Transfer size: 4096 bytes 00:06:55.207 Vector count 1 00:06:55.207 Module: software 00:06:55.207 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:55.207 Queue depth: 32 00:06:55.207 Allocate depth: 32 00:06:55.207 # threads/core: 1 00:06:55.207 Run time: 1 seconds 00:06:55.207 Verify: Yes 00:06:55.207 00:06:55.207 Running for 1 seconds... 00:06:55.207 00:06:55.207 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:55.207 ------------------------------------------------------------------------------------ 00:06:55.207 0,0 77408/s 302 MiB/s 0 0 00:06:55.207 3,0 78400/s 306 MiB/s 0 0 00:06:55.207 2,0 77920/s 304 MiB/s 0 0 00:06:55.207 1,0 77920/s 304 MiB/s 0 0 00:06:55.207 ==================================================================================== 00:06:55.207 Total 311648/s 1217 MiB/s 0 0' 00:06:55.207 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.207 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.207 23:53:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:55.208 23:53:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:55.208 23:53:44 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.208 23:53:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.208 23:53:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.208 23:53:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.208 23:53:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.208 23:53:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.208 23:53:44 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.208 23:53:44 -- accel/accel.sh@42 -- # jq -r . 00:06:55.208 [2024-04-25 23:53:44.521871] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:06:55.207 [2024-04-25 23:53:44.521960] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468261 ] 00:06:55.207 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.208 [2024-04-25 23:53:44.593089] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:55.208 [2024-04-25 23:53:44.630027] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:55.208 [2024-04-25 23:53:44.630046] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:55.208 [2024-04-25 23:53:44.630135] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:55.208 [2024-04-25 23:53:44.630137] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val= 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val= 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val= 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val=0xf 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val= 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val= 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val=decompress 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val= 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val=software 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@23 -- # accel_module=software 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val=32 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val=32 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val=1 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val=Yes 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val= 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:55.208 23:53:44 -- accel/accel.sh@21 -- # val= 00:06:55.208 23:53:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # IFS=: 00:06:55.208 23:53:44 -- accel/accel.sh@20 -- # read -r var val 00:06:56.689 23:53:45 -- accel/accel.sh@21 -- # val= 00:06:56.689 23:53:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.689 23:53:45 -- accel/accel.sh@20 -- # IFS=: 00:06:56.689 23:53:45 -- accel/accel.sh@20 -- # read -r var val 00:06:56.689 23:53:45 -- accel/accel.sh@21 -- # val= 00:06:56.689 23:53:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.689 23:53:45 -- accel/accel.sh@20 -- # IFS=: 00:06:56.689 23:53:45 -- accel/accel.sh@20 -- # read -r var val 00:06:56.689 23:53:45 -- accel/accel.sh@21 -- # val= 00:06:56.689 23:53:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.689 23:53:45 -- accel/accel.sh@20 -- # IFS=: 00:06:56.689 23:53:45 -- accel/accel.sh@20 -- # read -r var val 00:06:56.689 23:53:45 -- accel/accel.sh@21 -- # val= 00:06:56.689 23:53:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.689 23:53:45 -- accel/accel.sh@20 -- # IFS=: 00:06:56.689 23:53:45 -- accel/accel.sh@20 -- # read -r var val 00:06:56.689 23:53:45 -- accel/accel.sh@21 -- # val= 00:06:56.689 23:53:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.689 23:53:45 -- accel/accel.sh@20 -- # IFS=: 00:06:56.689 23:53:45 -- accel/accel.sh@20 -- # read -r var val 00:06:56.689 23:53:45 -- accel/accel.sh@21 -- # val= 00:06:56.690 23:53:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.690 23:53:45 -- accel/accel.sh@20 -- # IFS=: 00:06:56.690 23:53:45 -- accel/accel.sh@20 -- # read -r var val 00:06:56.690 23:53:45 -- accel/accel.sh@21 -- # val= 00:06:56.690 23:53:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.690 23:53:45 -- accel/accel.sh@20 -- # IFS=: 00:06:56.690 23:53:45 -- accel/accel.sh@20 -- # read -r var val 00:06:56.690 23:53:45 -- accel/accel.sh@21 -- # val= 00:06:56.690 23:53:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.690 
23:53:45 -- accel/accel.sh@20 -- # IFS=: 00:06:56.690 23:53:45 -- accel/accel.sh@20 -- # read -r var val 00:06:56.690 23:53:45 -- accel/accel.sh@21 -- # val= 00:06:56.690 23:53:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.690 23:53:45 -- accel/accel.sh@20 -- # IFS=: 00:06:56.690 23:53:45 -- accel/accel.sh@20 -- # read -r var val 00:06:56.690 23:53:45 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:56.690 23:53:45 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:56.690 23:53:45 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.690 00:06:56.690 real 0m2.608s 00:06:56.690 user 0m4.507s 00:06:56.690 sys 0m0.144s 00:06:56.690 23:53:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.690 23:53:45 -- common/autotest_common.sh@10 -- # set +x 00:06:56.690 ************************************ 00:06:56.690 END TEST accel_decomp_mcore 00:06:56.690 ************************************ 00:06:56.690 23:53:45 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:56.690 23:53:45 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:56.690 23:53:45 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:56.690 23:53:45 -- common/autotest_common.sh@10 -- # set +x 00:06:56.690 ************************************ 00:06:56.690 START TEST accel_decomp_full_mcore 00:06:56.690 ************************************ 00:06:56.690 23:53:45 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:56.690 23:53:45 -- accel/accel.sh@16 -- # local accel_opc 00:06:56.690 23:53:45 -- accel/accel.sh@17 -- # local accel_module 00:06:56.690 23:53:45 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:56.690 23:53:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:56.690 23:53:45 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.690 23:53:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.690 23:53:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.690 23:53:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.690 23:53:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.690 23:53:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.690 23:53:45 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.690 23:53:45 -- accel/accel.sh@42 -- # jq -r . 00:06:56.690 [2024-04-25 23:53:45.874780] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
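accel_decomp_full_mcore combines the two preceding variants: -o 0 (111250-byte transfers) together with -m 0xf (four cores). The Total row should therefore be the sum of the four per-core rates; checking the run below in bash:

  echo $(( (5792 + 3 * 5824) * 111250 / 1048576 ))   # 23264 transfers/s -> 2468 MiB/s total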
00:06:56.690 [2024-04-25 23:53:45.874890] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468467 ] 00:06:56.690 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.690 [2024-04-25 23:53:45.944274] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:56.690 [2024-04-25 23:53:45.981965] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:56.690 [2024-04-25 23:53:45.982056] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:56.690 [2024-04-25 23:53:45.982142] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:56.690 [2024-04-25 23:53:45.982144] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.781 23:53:47 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:57.781 00:06:57.781 SPDK Configuration: 00:06:57.781 Core mask: 0xf 00:06:57.781 00:06:57.781 Accel Perf Configuration: 00:06:57.781 Workload Type: decompress 00:06:57.781 Transfer size: 111250 bytes 00:06:57.781 Vector count 1 00:06:57.781 Module: software 00:06:57.781 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:57.781 Queue depth: 32 00:06:57.781 Allocate depth: 32 00:06:57.781 # threads/core: 1 00:06:57.781 Run time: 1 seconds 00:06:57.781 Verify: Yes 00:06:57.781 00:06:57.781 Running for 1 seconds... 00:06:57.781 00:06:57.781 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:57.781 ------------------------------------------------------------------------------------ 00:06:57.781 0,0 5792/s 614 MiB/s 0 0 00:06:57.781 3,0 5824/s 617 MiB/s 0 0 00:06:57.781 2,0 5824/s 617 MiB/s 0 0 00:06:57.781 1,0 5824/s 617 MiB/s 0 0 00:06:57.781 ==================================================================================== 00:06:57.781 Total 23264/s 2468 MiB/s 0 0' 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:57.781 23:53:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:57.781 23:53:47 -- accel/accel.sh@12 -- # build_accel_config 00:06:57.781 23:53:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:57.781 23:53:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.781 23:53:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.781 23:53:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:57.781 23:53:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:57.781 23:53:47 -- accel/accel.sh@41 -- # local IFS=, 00:06:57.781 23:53:47 -- accel/accel.sh@42 -- # jq -r . 00:06:57.781 [2024-04-25 23:53:47.182604] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:06:57.781 [2024-04-25 23:53:47.182695] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468663 ] 00:06:57.781 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.781 [2024-04-25 23:53:47.254000] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:57.781 [2024-04-25 23:53:47.291211] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:57.781 [2024-04-25 23:53:47.291307] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:57.781 [2024-04-25 23:53:47.291388] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:57.781 [2024-04-25 23:53:47.291390] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val= 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val= 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val= 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val=0xf 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val= 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val= 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val=decompress 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val= 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val=software 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@23 -- # accel_module=software 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val=32 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val=32 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val=1 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val=Yes 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val= 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:57.781 23:53:47 -- accel/accel.sh@21 -- # val= 00:06:57.781 23:53:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # IFS=: 00:06:57.781 23:53:47 -- accel/accel.sh@20 -- # read -r var val 00:06:59.159 23:53:48 -- accel/accel.sh@21 -- # val= 00:06:59.159 23:53:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # IFS=: 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # read -r var val 00:06:59.159 23:53:48 -- accel/accel.sh@21 -- # val= 00:06:59.159 23:53:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # IFS=: 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # read -r var val 00:06:59.159 23:53:48 -- accel/accel.sh@21 -- # val= 00:06:59.159 23:53:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # IFS=: 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # read -r var val 00:06:59.159 23:53:48 -- accel/accel.sh@21 -- # val= 00:06:59.159 23:53:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # IFS=: 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # read -r var val 00:06:59.159 23:53:48 -- accel/accel.sh@21 -- # val= 00:06:59.159 23:53:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # IFS=: 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # read -r var val 00:06:59.159 23:53:48 -- accel/accel.sh@21 -- # val= 00:06:59.159 23:53:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # IFS=: 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # read -r var val 00:06:59.159 23:53:48 -- accel/accel.sh@21 -- # val= 00:06:59.159 23:53:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # IFS=: 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # read -r var val 00:06:59.159 23:53:48 -- accel/accel.sh@21 -- # val= 00:06:59.159 23:53:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.159 
23:53:48 -- accel/accel.sh@20 -- # IFS=: 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # read -r var val 00:06:59.159 23:53:48 -- accel/accel.sh@21 -- # val= 00:06:59.159 23:53:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # IFS=: 00:06:59.159 23:53:48 -- accel/accel.sh@20 -- # read -r var val 00:06:59.159 23:53:48 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:59.159 23:53:48 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:59.159 23:53:48 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.159 00:06:59.159 real 0m2.626s 00:06:59.159 user 0m9.065s 00:06:59.159 sys 0m0.271s 00:06:59.159 23:53:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:59.159 23:53:48 -- common/autotest_common.sh@10 -- # set +x 00:06:59.159 ************************************ 00:06:59.159 END TEST accel_decomp_full_mcore 00:06:59.159 ************************************ 00:06:59.160 23:53:48 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.160 23:53:48 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:59.160 23:53:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:59.160 23:53:48 -- common/autotest_common.sh@10 -- # set +x 00:06:59.160 ************************************ 00:06:59.160 START TEST accel_decomp_mthread 00:06:59.160 ************************************ 00:06:59.160 23:53:48 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.160 23:53:48 -- accel/accel.sh@16 -- # local accel_opc 00:06:59.160 23:53:48 -- accel/accel.sh@17 -- # local accel_module 00:06:59.160 23:53:48 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.160 23:53:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.160 23:53:48 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.160 23:53:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.160 23:53:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.160 23:53:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.160 23:53:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.160 23:53:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.160 23:53:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.160 23:53:48 -- accel/accel.sh@42 -- # jq -r . 00:06:59.160 [2024-04-25 23:53:48.548812] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:06:59.160 [2024-04-25 23:53:48.548903] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468950 ] 00:06:59.160 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.160 [2024-04-25 23:53:48.619467] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.160 [2024-04-25 23:53:48.654665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.537 23:53:49 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:00.537 00:07:00.537 SPDK Configuration: 00:07:00.537 Core mask: 0x1 00:07:00.537 00:07:00.537 Accel Perf Configuration: 00:07:00.537 Workload Type: decompress 00:07:00.537 Transfer size: 4096 bytes 00:07:00.537 Vector count 1 00:07:00.537 Module: software 00:07:00.537 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:00.537 Queue depth: 32 00:07:00.537 Allocate depth: 32 00:07:00.537 # threads/core: 2 00:07:00.537 Run time: 1 seconds 00:07:00.537 Verify: Yes 00:07:00.537 00:07:00.537 Running for 1 seconds... 00:07:00.537 00:07:00.537 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:00.537 ------------------------------------------------------------------------------------ 00:07:00.537 0,1 47456/s 87 MiB/s 0 0 00:07:00.537 0,0 47360/s 87 MiB/s 0 0 00:07:00.537 ==================================================================================== 00:07:00.538 Total 94816/s 370 MiB/s 0 0' 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:00.538 23:53:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:00.538 23:53:49 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.538 23:53:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.538 23:53:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.538 23:53:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.538 23:53:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.538 23:53:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.538 23:53:49 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.538 23:53:49 -- accel/accel.sh@42 -- # jq -r . 00:07:00.538 [2024-04-25 23:53:49.841691] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
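[editor's note] The Total row above is consistent with rate x transfer size: 94816 transfers/s x 4096 bytes is roughly 370 MiB/s. A minimal stand-alone sketch of the logged accel_perf command line follows; SPDK_DIR is an assumption, and the harness additionally injects its accel JSON config via -c /dev/fd/62, which is omitted here.

```bash
#!/usr/bin/env bash
# Hypothetical repro of the accel_decomp_mthread step; flags mirror the log.
SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}

# -t 1 : run for 1 second          -w decompress : workload type
# -l   : compressed input file     -y            : verify output
# -T 2 : two worker threads per core (the 0,0 and 0,1 rows above)
"$SPDK_DIR/build/examples/accel_perf" \
    -t 1 -w decompress -l "$SPDK_DIR/test/accel/bib" -y -T 2
```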
00:07:00.538 [2024-04-25 23:53:49.841782] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid469224 ] 00:07:00.538 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.538 [2024-04-25 23:53:49.912153] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.538 [2024-04-25 23:53:49.946046] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val= 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val= 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val= 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val=0x1 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val= 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val= 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val=decompress 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val= 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val=software 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@23 -- # accel_module=software 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:49 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:49 -- accel/accel.sh@21 -- # val=32 00:07:00.538 23:53:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:50 
-- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:50 -- accel/accel.sh@21 -- # val=32 00:07:00.538 23:53:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:50 -- accel/accel.sh@21 -- # val=2 00:07:00.538 23:53:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:50 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:00.538 23:53:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:50 -- accel/accel.sh@21 -- # val=Yes 00:07:00.538 23:53:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:50 -- accel/accel.sh@21 -- # val= 00:07:00.538 23:53:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # read -r var val 00:07:00.538 23:53:50 -- accel/accel.sh@21 -- # val= 00:07:00.538 23:53:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # IFS=: 00:07:00.538 23:53:50 -- accel/accel.sh@20 -- # read -r var val 00:07:01.916 23:53:51 -- accel/accel.sh@21 -- # val= 00:07:01.916 23:53:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # IFS=: 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # read -r var val 00:07:01.916 23:53:51 -- accel/accel.sh@21 -- # val= 00:07:01.916 23:53:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # IFS=: 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # read -r var val 00:07:01.916 23:53:51 -- accel/accel.sh@21 -- # val= 00:07:01.916 23:53:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # IFS=: 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # read -r var val 00:07:01.916 23:53:51 -- accel/accel.sh@21 -- # val= 00:07:01.916 23:53:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # IFS=: 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # read -r var val 00:07:01.916 23:53:51 -- accel/accel.sh@21 -- # val= 00:07:01.916 23:53:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # IFS=: 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # read -r var val 00:07:01.916 23:53:51 -- accel/accel.sh@21 -- # val= 00:07:01.916 23:53:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # IFS=: 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # read -r var val 00:07:01.916 23:53:51 -- accel/accel.sh@21 -- # val= 00:07:01.916 23:53:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # IFS=: 00:07:01.916 23:53:51 -- accel/accel.sh@20 -- # read -r var val 00:07:01.916 23:53:51 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:01.916 23:53:51 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:01.916 23:53:51 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.916 00:07:01.916 real 0m2.591s 00:07:01.916 user 0m2.334s 00:07:01.916 sys 0m0.267s 00:07:01.916 23:53:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:01.916 23:53:51 -- common/autotest_common.sh@10 -- # set +x 
00:07:01.916 ************************************ 00:07:01.916 END TEST accel_decomp_mthread 00:07:01.916 ************************************ 00:07:01.916 23:53:51 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:01.916 23:53:51 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:01.916 23:53:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:01.916 23:53:51 -- common/autotest_common.sh@10 -- # set +x 00:07:01.916 ************************************ 00:07:01.916 START TEST accel_deomp_full_mthread 00:07:01.916 ************************************ 00:07:01.916 23:53:51 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:01.916 23:53:51 -- accel/accel.sh@16 -- # local accel_opc 00:07:01.916 23:53:51 -- accel/accel.sh@17 -- # local accel_module 00:07:01.916 23:53:51 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:01.916 23:53:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:01.916 23:53:51 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.916 23:53:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:01.916 23:53:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.916 23:53:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.916 23:53:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:01.916 23:53:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:01.916 23:53:51 -- accel/accel.sh@41 -- # local IFS=, 00:07:01.916 23:53:51 -- accel/accel.sh@42 -- # jq -r . 00:07:01.916 [2024-04-25 23:53:51.187713] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:01.916 [2024-04-25 23:53:51.187804] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid469510 ] 00:07:01.916 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.916 [2024-04-25 23:53:51.258434] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.916 [2024-04-25 23:53:51.293702] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.293 23:53:52 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:03.293 00:07:03.293 SPDK Configuration: 00:07:03.293 Core mask: 0x1 00:07:03.293 00:07:03.293 Accel Perf Configuration: 00:07:03.293 Workload Type: decompress 00:07:03.293 Transfer size: 111250 bytes 00:07:03.293 Vector count 1 00:07:03.293 Module: software 00:07:03.293 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:03.293 Queue depth: 32 00:07:03.293 Allocate depth: 32 00:07:03.293 # threads/core: 2 00:07:03.293 Run time: 1 seconds 00:07:03.293 Verify: Yes 00:07:03.293 00:07:03.293 Running for 1 seconds... 
00:07:03.293 00:07:03.293 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:03.293 ------------------------------------------------------------------------------------ 00:07:03.293 0,1 2976/s 122 MiB/s 0 0 00:07:03.293 0,0 2944/s 121 MiB/s 0 0 00:07:03.293 ==================================================================================== 00:07:03.293 Total 5920/s 628 MiB/s 0 0' 00:07:03.293 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.293 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.293 23:53:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:03.293 23:53:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:03.293 23:53:52 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.293 23:53:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.293 23:53:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.293 23:53:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.293 23:53:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.293 23:53:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.293 23:53:52 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.293 23:53:52 -- accel/accel.sh@42 -- # jq -r . 00:07:03.293 [2024-04-25 23:53:52.498211] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:03.293 [2024-04-25 23:53:52.498296] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid469776 ] 00:07:03.293 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.293 [2024-04-25 23:53:52.566875] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.293 [2024-04-25 23:53:52.601046] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.293 23:53:52 -- accel/accel.sh@21 -- # val= 00:07:03.293 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.293 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.293 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.293 23:53:52 -- accel/accel.sh@21 -- # val= 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val= 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val=0x1 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val= 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val= 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val=decompress 
00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val= 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val=software 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@23 -- # accel_module=software 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val=32 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val=32 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val=2 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val=Yes 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val= 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:03.294 23:53:52 -- accel/accel.sh@21 -- # val= 00:07:03.294 23:53:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # IFS=: 00:07:03.294 23:53:52 -- accel/accel.sh@20 -- # read -r var val 00:07:04.227 23:53:53 -- accel/accel.sh@21 -- # val= 00:07:04.227 23:53:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # IFS=: 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # read -r var val 00:07:04.227 23:53:53 -- accel/accel.sh@21 -- # val= 00:07:04.227 23:53:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # IFS=: 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # read -r var val 00:07:04.227 23:53:53 -- accel/accel.sh@21 -- # val= 00:07:04.227 23:53:53 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # IFS=: 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # read -r var val 00:07:04.227 23:53:53 -- accel/accel.sh@21 -- # val= 00:07:04.227 23:53:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # IFS=: 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # read -r var val 00:07:04.227 23:53:53 -- accel/accel.sh@21 -- # val= 00:07:04.227 23:53:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # IFS=: 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # read -r var val 00:07:04.227 23:53:53 -- accel/accel.sh@21 -- # val= 00:07:04.227 23:53:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # IFS=: 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # read -r var val 00:07:04.227 23:53:53 -- accel/accel.sh@21 -- # val= 00:07:04.227 23:53:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # IFS=: 00:07:04.227 23:53:53 -- accel/accel.sh@20 -- # read -r var val 00:07:04.227 23:53:53 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:04.227 23:53:53 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:04.227 23:53:53 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.227 00:07:04.227 real 0m2.627s 00:07:04.227 user 0m2.384s 00:07:04.227 sys 0m0.251s 00:07:04.227 23:53:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.227 23:53:53 -- common/autotest_common.sh@10 -- # set +x 00:07:04.227 ************************************ 00:07:04.227 END TEST accel_deomp_full_mthread 00:07:04.227 ************************************ 00:07:04.227 23:53:53 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:04.227 23:53:53 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:04.227 23:53:53 -- accel/accel.sh@129 -- # build_accel_config 00:07:04.227 23:53:53 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:04.227 23:53:53 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:04.227 23:53:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.227 23:53:53 -- common/autotest_common.sh@10 -- # set +x 00:07:04.227 23:53:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.227 23:53:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.486 23:53:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.486 23:53:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.486 23:53:53 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.486 23:53:53 -- accel/accel.sh@42 -- # jq -r . 00:07:04.486 ************************************ 00:07:04.486 START TEST accel_dif_functional_tests 00:07:04.486 ************************************ 00:07:04.486 23:53:53 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:04.486 [2024-04-25 23:53:53.867714] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
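[editor's note] The suite that follows (accel_dif_functional_tests) is CUnit-based: each negative verify case feeds a block whose Guard, App Tag, or Ref Tag was deliberately corrupted and expects the compare-failure ERROR lines shown below. A hedged sketch of a direct invocation, mirroring the logged command; the empty JSON object standing in for the harness-built accel config is an assumption.

```bash
#!/usr/bin/env bash
# Hypothetical direct run of the DIF functional tests. The harness feeds
# its accel JSON config over /dev/fd/62; process substitution reproduces
# that, with an empty object assumed to be enough for the software path.
SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
"$SPDK_DIR/test/accel/dif/dif" -c <(echo '{}')
```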
00:07:04.486 [2024-04-25 23:53:53.867804] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470047 ] 00:07:04.486 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.486 [2024-04-25 23:53:53.938235] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:04.486 [2024-04-25 23:53:53.975052] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:04.486 [2024-04-25 23:53:53.975147] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.486 [2024-04-25 23:53:53.975148] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:04.486 00:07:04.486 00:07:04.486 CUnit - A unit testing framework for C - Version 2.1-3 00:07:04.486 http://cunit.sourceforge.net/ 00:07:04.486 00:07:04.486 00:07:04.486 Suite: accel_dif 00:07:04.486 Test: verify: DIF generated, GUARD check ...passed 00:07:04.486 Test: verify: DIF generated, APPTAG check ...passed 00:07:04.486 Test: verify: DIF generated, REFTAG check ...passed 00:07:04.486 Test: verify: DIF not generated, GUARD check ...[2024-04-25 23:53:54.037298] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:04.486 [2024-04-25 23:53:54.037345] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:04.486 passed 00:07:04.486 Test: verify: DIF not generated, APPTAG check ...[2024-04-25 23:53:54.037378] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:04.486 [2024-04-25 23:53:54.037400] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:04.486 passed 00:07:04.486 Test: verify: DIF not generated, REFTAG check ...[2024-04-25 23:53:54.037420] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:04.486 [2024-04-25 23:53:54.037444] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:04.486 passed 00:07:04.486 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:04.486 Test: verify: APPTAG incorrect, APPTAG check ...[2024-04-25 23:53:54.037487] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:04.486 passed 00:07:04.486 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:04.486 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:04.486 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:04.486 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-04-25 23:53:54.037581] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:04.486 passed 00:07:04.486 Test: generate copy: DIF generated, GUARD check ...passed 00:07:04.486 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:04.486 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:04.486 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:04.486 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:04.486 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:04.486 Test: generate copy: iovecs-len validate ...[2024-04-25 23:53:54.037745] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:04.486 passed 00:07:04.486 Test: generate copy: buffer alignment validate ...passed 00:07:04.486 00:07:04.486 Run Summary: Type Total Ran Passed Failed Inactive 00:07:04.486 suites 1 1 n/a 0 0 00:07:04.486 tests 20 20 20 0 0 00:07:04.486 asserts 204 204 204 0 n/a 00:07:04.486 00:07:04.486 Elapsed time = 0.002 seconds 00:07:04.745 00:07:04.745 real 0m0.345s 00:07:04.745 user 0m0.527s 00:07:04.745 sys 0m0.159s 00:07:04.745 23:53:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.745 23:53:54 -- common/autotest_common.sh@10 -- # set +x 00:07:04.745 ************************************ 00:07:04.745 END TEST accel_dif_functional_tests 00:07:04.745 ************************************ 00:07:04.745 00:07:04.745 real 0m55.309s 00:07:04.745 user 1m2.922s 00:07:04.745 sys 0m7.117s 00:07:04.745 23:53:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.745 23:53:54 -- common/autotest_common.sh@10 -- # set +x 00:07:04.745 ************************************ 00:07:04.745 END TEST accel 00:07:04.745 ************************************ 00:07:04.745 23:53:54 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:04.745 23:53:54 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:04.745 23:53:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:04.745 23:53:54 -- common/autotest_common.sh@10 -- # set +x 00:07:04.745 ************************************ 00:07:04.745 START TEST accel_rpc 00:07:04.745 ************************************ 00:07:04.745 23:53:54 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:05.004 * Looking for test storage... 00:07:05.004 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=470130 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@15 -- # waitforlisten 470130 00:07:05.004 23:53:54 -- common/autotest_common.sh@819 -- # '[' -z 470130 ']' 00:07:05.004 23:53:54 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:05.004 23:53:54 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:05.004 23:53:54 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:05.004 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:05.004 23:53:54 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:05.004 23:53:54 -- common/autotest_common.sh@10 -- # set +x 00:07:05.004 [2024-04-25 23:53:54.389690] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
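[editor's note] accel_rpc drives the freshly started spdk_tgt --wait-for-rpc purely over JSON-RPC: assign the copy opcode to a bogus module, re-assign it to software, start the framework, and confirm the assignment stuck. A sketch of that same flow with scripts/rpc.py against an already-running target (default RPC socket assumed); the RPC method names are the ones traced below.

```bash
#!/usr/bin/env bash
# Hypothetical replay of the accel_assign_opcode flow below.
SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
RPC="$SPDK_DIR/scripts/rpc.py"

# Before framework_start_init the assignment is merely recorded, so even a
# nonexistent module name is accepted (the 'incorrect' NOTICE in the log).
"$RPC" accel_assign_opc -o copy -m incorrect
"$RPC" accel_assign_opc -o copy -m software   # last assignment wins
"$RPC" framework_start_init

# Verify the copy opcode landed on the software module.
"$RPC" accel_get_opc_assignments | jq -r .copy | grep -q software && echo OK
```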
00:07:05.004 [2024-04-25 23:53:54.389740] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470130 ] 00:07:05.004 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.004 [2024-04-25 23:53:54.455842] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.004 [2024-04-25 23:53:54.493328] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:05.004 [2024-04-25 23:53:54.493447] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.004 23:53:54 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:05.004 23:53:54 -- common/autotest_common.sh@852 -- # return 0 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:05.004 23:53:54 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:05.004 23:53:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:05.004 23:53:54 -- common/autotest_common.sh@10 -- # set +x 00:07:05.004 ************************************ 00:07:05.004 START TEST accel_assign_opcode 00:07:05.004 ************************************ 00:07:05.004 23:53:54 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:05.004 23:53:54 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:05.004 23:53:54 -- common/autotest_common.sh@10 -- # set +x 00:07:05.004 [2024-04-25 23:53:54.561920] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:05.004 23:53:54 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:05.004 23:53:54 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:05.004 23:53:54 -- common/autotest_common.sh@10 -- # set +x 00:07:05.004 [2024-04-25 23:53:54.569933] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:05.004 23:53:54 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:05.004 23:53:54 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:05.004 23:53:54 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:05.004 23:53:54 -- common/autotest_common.sh@10 -- # set +x 00:07:05.264 23:53:54 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:05.264 23:53:54 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:05.264 23:53:54 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:05.264 23:53:54 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:05.264 23:53:54 -- common/autotest_common.sh@10 -- # set +x 00:07:05.264 23:53:54 -- accel/accel_rpc.sh@42 -- # grep software 00:07:05.264 23:53:54 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:05.264 software 00:07:05.264 00:07:05.264 real 0m0.219s 00:07:05.264 user 0m0.043s 00:07:05.264 sys 0m0.017s 00:07:05.264 23:53:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:05.264 23:53:54 -- common/autotest_common.sh@10 -- # set +x 
00:07:05.264 ************************************ 00:07:05.264 END TEST accel_assign_opcode 00:07:05.264 ************************************ 00:07:05.264 23:53:54 -- accel/accel_rpc.sh@55 -- # killprocess 470130 00:07:05.264 23:53:54 -- common/autotest_common.sh@926 -- # '[' -z 470130 ']' 00:07:05.264 23:53:54 -- common/autotest_common.sh@930 -- # kill -0 470130 00:07:05.264 23:53:54 -- common/autotest_common.sh@931 -- # uname 00:07:05.264 23:53:54 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:05.264 23:53:54 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 470130 00:07:05.264 23:53:54 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:05.264 23:53:54 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:05.264 23:53:54 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 470130' 00:07:05.264 killing process with pid 470130 00:07:05.264 23:53:54 -- common/autotest_common.sh@945 -- # kill 470130 00:07:05.264 23:53:54 -- common/autotest_common.sh@950 -- # wait 470130 00:07:05.832 00:07:05.832 real 0m0.882s 00:07:05.832 user 0m0.814s 00:07:05.832 sys 0m0.403s 00:07:05.832 23:53:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:05.832 23:53:55 -- common/autotest_common.sh@10 -- # set +x 00:07:05.832 ************************************ 00:07:05.832 END TEST accel_rpc 00:07:05.832 ************************************ 00:07:05.832 23:53:55 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:05.832 23:53:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:05.832 23:53:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:05.832 23:53:55 -- common/autotest_common.sh@10 -- # set +x 00:07:05.832 ************************************ 00:07:05.832 START TEST app_cmdline 00:07:05.832 ************************************ 00:07:05.832 23:53:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:05.832 * Looking for test storage... 00:07:05.832 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:05.832 23:53:55 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:05.832 23:53:55 -- app/cmdline.sh@17 -- # spdk_tgt_pid=470387 00:07:05.832 23:53:55 -- app/cmdline.sh@18 -- # waitforlisten 470387 00:07:05.832 23:53:55 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:05.832 23:53:55 -- common/autotest_common.sh@819 -- # '[' -z 470387 ']' 00:07:05.832 23:53:55 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:05.832 23:53:55 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:05.832 23:53:55 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:05.832 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:05.832 23:53:55 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:05.832 23:53:55 -- common/autotest_common.sh@10 -- # set +x 00:07:05.832 [2024-04-25 23:53:55.330165] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
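[editor's note] app_cmdline starts the target with an RPC allowlist, so only spdk_get_version and rpc_get_methods are reachable; any other method should fail with JSON-RPC error -32601, which is exactly what the env_dpdk_get_mem_stats probe below exercises. A rough sketch; the backgrounding and fixed sleep are assumptions (the harness uses waitforlisten instead).

```bash
#!/usr/bin/env bash
SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
RPC="$SPDK_DIR/scripts/rpc.py"

# Start the target with only two RPC methods allowed, as in the log.
"$SPDK_DIR/build/bin/spdk_tgt" --rpcs-allowed spdk_get_version,rpc_get_methods &
sleep 1   # crude stand-in for the harness's waitforlisten

"$RPC" spdk_get_version        # allowed: prints the version JSON
"$RPC" env_dpdk_get_mem_stats  # not allowed: expect -32601 Method not found
kill %1
```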
00:07:05.832 [2024-04-25 23:53:55.330234] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470387 ] 00:07:05.832 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.832 [2024-04-25 23:53:55.398072] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.832 [2024-04-25 23:53:55.434486] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:05.832 [2024-04-25 23:53:55.434604] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.778 23:53:56 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:06.778 23:53:56 -- common/autotest_common.sh@852 -- # return 0 00:07:06.779 23:53:56 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:06.779 { 00:07:06.779 "version": "SPDK v24.01.1-pre git sha1 36faa8c31", 00:07:06.779 "fields": { 00:07:06.779 "major": 24, 00:07:06.779 "minor": 1, 00:07:06.779 "patch": 1, 00:07:06.779 "suffix": "-pre", 00:07:06.779 "commit": "36faa8c31" 00:07:06.779 } 00:07:06.779 } 00:07:06.779 23:53:56 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:06.779 23:53:56 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:06.779 23:53:56 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:06.779 23:53:56 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:06.779 23:53:56 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:06.779 23:53:56 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:06.779 23:53:56 -- app/cmdline.sh@26 -- # sort 00:07:06.779 23:53:56 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:06.779 23:53:56 -- common/autotest_common.sh@10 -- # set +x 00:07:06.779 23:53:56 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:06.779 23:53:56 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:06.779 23:53:56 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:06.779 23:53:56 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:06.779 23:53:56 -- common/autotest_common.sh@640 -- # local es=0 00:07:06.779 23:53:56 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:06.779 23:53:56 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:06.779 23:53:56 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:06.779 23:53:56 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:06.779 23:53:56 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:06.779 23:53:56 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:06.779 23:53:56 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:06.779 23:53:56 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:06.779 23:53:56 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:06.779 23:53:56 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:07.041 request: 00:07:07.041 { 00:07:07.041 "method": "env_dpdk_get_mem_stats", 00:07:07.041 "req_id": 1 00:07:07.041 } 00:07:07.041 Got JSON-RPC error response 00:07:07.041 response: 00:07:07.041 { 00:07:07.041 "code": -32601, 00:07:07.041 "message": "Method not found" 00:07:07.041 } 00:07:07.041 23:53:56 -- common/autotest_common.sh@643 -- # es=1 00:07:07.041 23:53:56 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:07.041 23:53:56 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:07.041 23:53:56 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:07.041 23:53:56 -- app/cmdline.sh@1 -- # killprocess 470387 00:07:07.041 23:53:56 -- common/autotest_common.sh@926 -- # '[' -z 470387 ']' 00:07:07.041 23:53:56 -- common/autotest_common.sh@930 -- # kill -0 470387 00:07:07.041 23:53:56 -- common/autotest_common.sh@931 -- # uname 00:07:07.041 23:53:56 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:07.041 23:53:56 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 470387 00:07:07.041 23:53:56 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:07.041 23:53:56 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:07.041 23:53:56 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 470387' 00:07:07.041 killing process with pid 470387 00:07:07.041 23:53:56 -- common/autotest_common.sh@945 -- # kill 470387 00:07:07.041 23:53:56 -- common/autotest_common.sh@950 -- # wait 470387 00:07:07.300 00:07:07.300 real 0m1.640s 00:07:07.300 user 0m1.888s 00:07:07.300 sys 0m0.490s 00:07:07.300 23:53:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:07.300 23:53:56 -- common/autotest_common.sh@10 -- # set +x 00:07:07.300 ************************************ 00:07:07.300 END TEST app_cmdline 00:07:07.300 ************************************ 00:07:07.300 23:53:56 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:07.300 23:53:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:07.300 23:53:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:07.300 23:53:56 -- common/autotest_common.sh@10 -- # set +x 00:07:07.300 ************************************ 00:07:07.300 START TEST version 00:07:07.300 ************************************ 00:07:07.300 23:53:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:07.558 * Looking for test storage... 
00:07:07.558 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:07.558 23:53:56 -- app/version.sh@17 -- # get_header_version major 00:07:07.558 23:53:56 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:07.558 23:53:56 -- app/version.sh@14 -- # cut -f2 00:07:07.558 23:53:57 -- app/version.sh@14 -- # tr -d '"' 00:07:07.558 23:53:57 -- app/version.sh@17 -- # major=24 00:07:07.558 23:53:57 -- app/version.sh@18 -- # get_header_version minor 00:07:07.558 23:53:57 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:07.558 23:53:57 -- app/version.sh@14 -- # cut -f2 00:07:07.558 23:53:57 -- app/version.sh@14 -- # tr -d '"' 00:07:07.558 23:53:57 -- app/version.sh@18 -- # minor=1 00:07:07.558 23:53:57 -- app/version.sh@19 -- # get_header_version patch 00:07:07.558 23:53:57 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:07.558 23:53:57 -- app/version.sh@14 -- # cut -f2 00:07:07.558 23:53:57 -- app/version.sh@14 -- # tr -d '"' 00:07:07.558 23:53:57 -- app/version.sh@19 -- # patch=1 00:07:07.558 23:53:57 -- app/version.sh@20 -- # get_header_version suffix 00:07:07.559 23:53:57 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:07.559 23:53:57 -- app/version.sh@14 -- # cut -f2 00:07:07.559 23:53:57 -- app/version.sh@14 -- # tr -d '"' 00:07:07.559 23:53:57 -- app/version.sh@20 -- # suffix=-pre 00:07:07.559 23:53:57 -- app/version.sh@22 -- # version=24.1 00:07:07.559 23:53:57 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:07.559 23:53:57 -- app/version.sh@25 -- # version=24.1.1 00:07:07.559 23:53:57 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:07.559 23:53:57 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:07.559 23:53:57 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:07.559 23:53:57 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:07.559 23:53:57 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:07.559 00:07:07.559 real 0m0.172s 00:07:07.559 user 0m0.102s 00:07:07.559 sys 0m0.110s 00:07:07.559 23:53:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:07.559 23:53:57 -- common/autotest_common.sh@10 -- # set +x 00:07:07.559 ************************************ 00:07:07.559 END TEST version 00:07:07.559 ************************************ 00:07:07.559 23:53:57 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@204 -- # uname -s 00:07:07.559 23:53:57 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:07:07.559 23:53:57 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:07.559 23:53:57 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:07.559 23:53:57 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@268 -- # timing_exit lib 
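[editor's note] version.sh scrapes each component out of include/spdk/version.h with the grep/cut/tr pipeline traced above, composes 24.1.1, maps the -pre suffix to rc0 (24.1.1rc0), and checks the result against Python's spdk.__version__. A condensed sketch of that extraction; the get_header_version helper name and the tab-separated #define layout are assumptions based on the trace.

```bash
#!/usr/bin/env bash
# Hypothetical condensation of the version extraction traced above.
SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
hdr="$SPDK_DIR/include/spdk/version.h"

get_header_version() {
    # e.g. '#define SPDK_VERSION_MAJOR<TAB>24' -> '24'
    grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" "$hdr" | cut -f2 | tr -d '"'
}

major=$(get_header_version MAJOR)    # 24
minor=$(get_header_version MINOR)    # 1
patch=$(get_header_version PATCH)    # 1
suffix=$(get_header_version SUFFIX)  # -pre
echo "${major}.${minor}.${patch}${suffix}"   # 24.1.1-pre
```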
00:07:07.559 23:53:57 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:07.559 23:53:57 -- common/autotest_common.sh@10 -- # set +x 00:07:07.559 23:53:57 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:07.559 23:53:57 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:07.818 23:53:57 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:07.818 23:53:57 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:07.818 23:53:57 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:07.818 23:53:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:07.818 23:53:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:07.818 23:53:57 -- common/autotest_common.sh@10 -- # set +x 00:07:07.818 ************************************ 00:07:07.818 START TEST llvm_fuzz 00:07:07.818 ************************************ 00:07:07.818 23:53:57 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:07.818 * Looking for test storage... 
00:07:07.818 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:07.818 23:53:57 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:07.818 23:53:57 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:07.818 23:53:57 -- common/autotest_common.sh@538 -- # fuzzers=() 00:07:07.818 23:53:57 -- common/autotest_common.sh@538 -- # local fuzzers 00:07:07.818 23:53:57 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:07:07.818 23:53:57 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:07.818 23:53:57 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:07.818 23:53:57 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:07.818 23:53:57 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:07.818 23:53:57 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:07.818 23:53:57 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:07.818 23:53:57 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:07.818 23:53:57 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:07.818 23:53:57 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:07.818 23:53:57 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:07.818 23:53:57 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:07.818 23:53:57 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:07.819 23:53:57 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:07.819 23:53:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:07.819 23:53:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:07.819 23:53:57 -- common/autotest_common.sh@10 -- # set +x 00:07:07.819 ************************************ 00:07:07.819 START TEST nvmf_fuzz 00:07:07.819 ************************************ 00:07:07.819 23:53:57 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:07.819 * Looking for test storage... 
00:07:07.819 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:07.819 23:53:57 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:07.819 23:53:57 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:07.819 23:53:57 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:07.819 23:53:57 -- common/autotest_common.sh@34 -- # set -e 00:07:07.819 23:53:57 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:07.819 23:53:57 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:07.819 23:53:57 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:07.819 23:53:57 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:07.819 23:53:57 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:07.819 23:53:57 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:07.819 23:53:57 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:07.819 23:53:57 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:07.819 23:53:57 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:07.819 23:53:57 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:07.819 23:53:57 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:07.819 23:53:57 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:07.819 23:53:57 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:07.819 23:53:57 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:07.819 23:53:57 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:07.819 23:53:57 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:07.819 23:53:57 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:07.819 23:53:57 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:07.819 23:53:57 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:07.819 23:53:57 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:07.819 23:53:57 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:07.819 23:53:57 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:07.819 23:53:57 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:07.819 23:53:57 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:07.819 23:53:57 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:07.819 23:53:57 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:07.819 23:53:57 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:07.819 23:53:57 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:07.819 23:53:57 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:07.819 23:53:57 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:07.819 23:53:57 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:07.819 23:53:57 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:07.819 23:53:57 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:07.819 23:53:57 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:07.819 23:53:57 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:07.819 23:53:57 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:07.819 23:53:57 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:07.819 23:53:57 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:07.819 23:53:57 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:07.819 23:53:57 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:07.819 23:53:57 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:07.819 23:53:57 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:07.819 23:53:57 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:07.819 23:53:57 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:07.819 23:53:57 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:07.819 23:53:57 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:07.819 23:53:57 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:07.819 23:53:57 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:07.819 23:53:57 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:07.819 23:53:57 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:07.819 23:53:57 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:07.819 23:53:57 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:07.819 23:53:57 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:07.819 23:53:57 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:07.819 23:53:57 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:07.819 23:53:57 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:07.819 23:53:57 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:07.819 23:53:57 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:07.819 23:53:57 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:07.819 23:53:57 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:07.819 23:53:57 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:07.819 23:53:57 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:07.819 23:53:57 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:07.819 23:53:57 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=n 00:07:07.819 23:53:57 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:07.819 23:53:57 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:07.819 23:53:57 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:07.819 23:53:57 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:07.819 23:53:57 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:07.819 23:53:57 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:07.819 23:53:57 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:07.819 23:53:57 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:07.819 23:53:57 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:07.819 23:53:57 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:07.819 23:53:57 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:07.819 23:53:57 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:07.819 23:53:57 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:07.819 23:53:57 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:07.819 23:53:57 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:07.819 23:53:57 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:07.819 23:53:57 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:07.819 23:53:57 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:07.819 
23:53:57 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:07.819 23:53:57 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:07.819 23:53:57 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:07.819 23:53:57 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:07.819 23:53:57 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:07.819 23:53:57 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:07.819 23:53:57 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:07.819 23:53:57 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:07.819 23:53:57 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:07.819 23:53:57 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:07.819 23:53:57 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:07.819 23:53:57 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:07.819 23:53:57 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:07.819 23:53:57 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:07.819 23:53:57 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:07.819 23:53:57 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:07.819 23:53:57 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:07.819 #define SPDK_CONFIG_H 00:07:07.819 #define SPDK_CONFIG_APPS 1 00:07:07.819 #define SPDK_CONFIG_ARCH native 00:07:07.819 #undef SPDK_CONFIG_ASAN 00:07:07.819 #undef SPDK_CONFIG_AVAHI 00:07:07.819 #undef SPDK_CONFIG_CET 00:07:07.819 #define SPDK_CONFIG_COVERAGE 1 00:07:07.819 #define SPDK_CONFIG_CROSS_PREFIX 00:07:07.819 #undef SPDK_CONFIG_CRYPTO 00:07:07.819 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:07.819 #undef SPDK_CONFIG_CUSTOMOCF 00:07:07.819 #undef SPDK_CONFIG_DAOS 00:07:07.819 #define SPDK_CONFIG_DAOS_DIR 00:07:07.819 #define SPDK_CONFIG_DEBUG 1 00:07:07.819 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:07.819 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:07.819 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:07.819 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:07.819 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:07.819 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:07.819 #define SPDK_CONFIG_EXAMPLES 1 00:07:07.819 #undef SPDK_CONFIG_FC 00:07:07.819 #define SPDK_CONFIG_FC_PATH 00:07:07.819 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:07.819 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:07.819 #undef SPDK_CONFIG_FUSE 00:07:07.819 #define SPDK_CONFIG_FUZZER 1 00:07:07.819 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:07.819 #undef SPDK_CONFIG_GOLANG 00:07:07.819 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:07.819 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:07.819 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:07.819 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:07.819 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:07.819 #define SPDK_CONFIG_IDXD 1 00:07:07.819 #undef SPDK_CONFIG_IDXD_KERNEL 00:07:07.819 #undef SPDK_CONFIG_IPSEC_MB 00:07:07.819 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:07.819 #define SPDK_CONFIG_ISAL 1 00:07:07.820 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:07.820 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:07.820 #define SPDK_CONFIG_LIBDIR 00:07:07.820 #undef SPDK_CONFIG_LTO 00:07:07.820 #define SPDK_CONFIG_MAX_LCORES 00:07:07.820 #define SPDK_CONFIG_NVME_CUSE 1 00:07:07.820 #undef SPDK_CONFIG_OCF 00:07:07.820 #define SPDK_CONFIG_OCF_PATH 00:07:07.820 #define SPDK_CONFIG_OPENSSL_PATH 00:07:07.820 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:07.820 #undef SPDK_CONFIG_PGO_USE 00:07:07.820 #define SPDK_CONFIG_PREFIX /usr/local 00:07:07.820 #undef SPDK_CONFIG_RAID5F 00:07:07.820 #undef SPDK_CONFIG_RBD 00:07:07.820 #define SPDK_CONFIG_RDMA 1 00:07:07.820 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:07.820 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:07.820 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:07.820 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:07.820 #undef SPDK_CONFIG_SHARED 00:07:07.820 #undef SPDK_CONFIG_SMA 00:07:07.820 #define SPDK_CONFIG_TESTS 1 00:07:07.820 #undef SPDK_CONFIG_TSAN 00:07:07.820 #define SPDK_CONFIG_UBLK 1 00:07:07.820 #define SPDK_CONFIG_UBSAN 1 00:07:07.820 #undef SPDK_CONFIG_UNIT_TESTS 00:07:07.820 #undef SPDK_CONFIG_URING 00:07:07.820 #define SPDK_CONFIG_URING_PATH 00:07:07.820 #undef SPDK_CONFIG_URING_ZNS 00:07:07.820 #undef SPDK_CONFIG_USDT 00:07:07.820 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:07.820 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:07.820 #define SPDK_CONFIG_VFIO_USER 1 00:07:07.820 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:07.820 #define SPDK_CONFIG_VHOST 1 00:07:07.820 #define SPDK_CONFIG_VIRTIO 1 00:07:07.820 #undef SPDK_CONFIG_VTUNE 00:07:07.820 #define SPDK_CONFIG_VTUNE_DIR 00:07:07.820 #define SPDK_CONFIG_WERROR 1 00:07:07.820 #define SPDK_CONFIG_WPDK_DIR 00:07:07.820 #undef SPDK_CONFIG_XNVME 00:07:07.820 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:07.820 23:53:57 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:07.820 23:53:57 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:07.820 23:53:57 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:07.820 23:53:57 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:07.820 23:53:57 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:07.820 23:53:57 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:07.820 23:53:57 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:07.820 23:53:57 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:07.820 23:53:57 -- paths/export.sh@5 -- # export PATH 00:07:07.820 23:53:57 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:07.820 23:53:57 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:08.080 23:53:57 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:08.080 23:53:57 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:08.080 23:53:57 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:08.080 23:53:57 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:08.080 23:53:57 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:08.080 23:53:57 -- pm/common@16 -- # TEST_TAG=N/A 00:07:08.080 23:53:57 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:08.080 23:53:57 -- common/autotest_common.sh@52 -- # : 1 00:07:08.080 23:53:57 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:08.080 23:53:57 -- common/autotest_common.sh@56 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:08.080 23:53:57 -- common/autotest_common.sh@58 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:08.080 23:53:57 -- common/autotest_common.sh@60 -- # : 1 00:07:08.080 23:53:57 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:08.080 23:53:57 -- common/autotest_common.sh@62 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:08.080 23:53:57 -- common/autotest_common.sh@64 -- # : 00:07:08.080 23:53:57 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:08.080 23:53:57 -- common/autotest_common.sh@66 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:08.080 23:53:57 
-- common/autotest_common.sh@68 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:08.080 23:53:57 -- common/autotest_common.sh@70 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:08.080 23:53:57 -- common/autotest_common.sh@72 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:08.080 23:53:57 -- common/autotest_common.sh@74 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:08.080 23:53:57 -- common/autotest_common.sh@76 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:08.080 23:53:57 -- common/autotest_common.sh@78 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:08.080 23:53:57 -- common/autotest_common.sh@80 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:08.080 23:53:57 -- common/autotest_common.sh@82 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:08.080 23:53:57 -- common/autotest_common.sh@84 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:08.080 23:53:57 -- common/autotest_common.sh@86 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:08.080 23:53:57 -- common/autotest_common.sh@88 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:08.080 23:53:57 -- common/autotest_common.sh@90 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:08.080 23:53:57 -- common/autotest_common.sh@92 -- # : 1 00:07:08.080 23:53:57 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:08.080 23:53:57 -- common/autotest_common.sh@94 -- # : 1 00:07:08.080 23:53:57 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:08.080 23:53:57 -- common/autotest_common.sh@96 -- # : rdma 00:07:08.080 23:53:57 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:08.080 23:53:57 -- common/autotest_common.sh@98 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:08.080 23:53:57 -- common/autotest_common.sh@100 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:08.080 23:53:57 -- common/autotest_common.sh@102 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:08.080 23:53:57 -- common/autotest_common.sh@104 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:08.080 23:53:57 -- common/autotest_common.sh@106 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:08.080 23:53:57 -- common/autotest_common.sh@108 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:07:08.080 23:53:57 -- common/autotest_common.sh@110 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:08.080 23:53:57 -- common/autotest_common.sh@112 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:08.080 23:53:57 -- common/autotest_common.sh@114 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:08.080 
23:53:57 -- common/autotest_common.sh@116 -- # : 1 00:07:08.080 23:53:57 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:08.080 23:53:57 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:08.080 23:53:57 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:08.080 23:53:57 -- common/autotest_common.sh@120 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:08.080 23:53:57 -- common/autotest_common.sh@122 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:08.080 23:53:57 -- common/autotest_common.sh@124 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:08.080 23:53:57 -- common/autotest_common.sh@126 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:08.080 23:53:57 -- common/autotest_common.sh@128 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:08.080 23:53:57 -- common/autotest_common.sh@130 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:08.080 23:53:57 -- common/autotest_common.sh@132 -- # : v23.11 00:07:08.080 23:53:57 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:08.080 23:53:57 -- common/autotest_common.sh@134 -- # : true 00:07:08.080 23:53:57 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:08.080 23:53:57 -- common/autotest_common.sh@136 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:08.080 23:53:57 -- common/autotest_common.sh@138 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:08.080 23:53:57 -- common/autotest_common.sh@140 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:08.080 23:53:57 -- common/autotest_common.sh@142 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:08.080 23:53:57 -- common/autotest_common.sh@144 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:08.080 23:53:57 -- common/autotest_common.sh@146 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:08.080 23:53:57 -- common/autotest_common.sh@148 -- # : 00:07:08.080 23:53:57 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:08.080 23:53:57 -- common/autotest_common.sh@150 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:08.080 23:53:57 -- common/autotest_common.sh@152 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:08.080 23:53:57 -- common/autotest_common.sh@154 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:08.080 23:53:57 -- common/autotest_common.sh@156 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:08.080 23:53:57 -- common/autotest_common.sh@158 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:08.080 23:53:57 -- common/autotest_common.sh@160 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:08.080 23:53:57 -- common/autotest_common.sh@163 -- # : 00:07:08.080 23:53:57 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:08.080 23:53:57 -- common/autotest_common.sh@165 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:08.080 23:53:57 -- common/autotest_common.sh@167 -- # : 0 00:07:08.080 23:53:57 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:08.080 23:53:57 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:08.080 23:53:57 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:08.080 23:53:57 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:08.080 23:53:57 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:08.080 23:53:57 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:08.080 23:53:57 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:08.080 23:53:57 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:08.080 23:53:57 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:08.080 23:53:57 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:08.080 23:53:57 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:08.080 23:53:57 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:08.080 23:53:57 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:08.080 23:53:57 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:08.080 23:53:57 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:08.080 23:53:57 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:08.080 23:53:57 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:08.080 23:53:57 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:08.080 23:53:57 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:08.080 23:53:57 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:08.080 23:53:57 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:08.080 23:53:57 -- common/autotest_common.sh@196 -- # cat 00:07:08.080 23:53:57 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:08.080 23:53:57 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:08.080 23:53:57 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:08.080 23:53:57 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:08.080 23:53:57 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:08.080 23:53:57 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:08.080 23:53:57 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:08.080 23:53:57 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:08.080 23:53:57 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:08.080 23:53:57 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:08.080 23:53:57 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:08.080 23:53:57 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:08.080 23:53:57 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:08.080 23:53:57 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:08.080 23:53:57 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:08.080 23:53:57 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:08.080 23:53:57 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:08.080 23:53:57 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:08.080 23:53:57 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:08.080 23:53:57 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:07:08.080 23:53:57 -- common/autotest_common.sh@249 -- # export valgrind= 00:07:08.080 23:53:57 -- common/autotest_common.sh@249 -- # valgrind= 00:07:08.080 23:53:57 -- common/autotest_common.sh@255 -- # uname -s 00:07:08.080 23:53:57 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:07:08.080 23:53:57 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:07:08.080 23:53:57 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:07:08.080 23:53:57 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:07:08.080 23:53:57 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:08.080 23:53:57 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:08.080 23:53:57 -- common/autotest_common.sh@265 -- # MAKE=make 00:07:08.080 23:53:57 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:07:08.080 23:53:57 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:07:08.080 23:53:57 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:07:08.080 23:53:57 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:08.080 23:53:57 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:07:08.080 23:53:57 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:07:08.080 23:53:57 -- common/autotest_common.sh@309 -- # [[ -z 470886 ]] 00:07:08.080 23:53:57 -- common/autotest_common.sh@309 -- # kill -0 470886 00:07:08.080 23:53:57 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:07:08.080 23:53:57 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:07:08.080 23:53:57 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:07:08.080 23:53:57 -- common/autotest_common.sh@322 -- # local mount target_dir 00:07:08.080 23:53:57 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:07:08.080 23:53:57 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:07:08.080 23:53:57 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:07:08.080 23:53:57 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:07:08.080 23:53:57 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.rFeFiU 00:07:08.080 23:53:57 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:08.080 23:53:57 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:07:08.080 23:53:57 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:07:08.080 23:53:57 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.rFeFiU/tests/nvmf /tmp/spdk.rFeFiU 00:07:08.080 23:53:57 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:07:08.080 23:53:57 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:08.080 23:53:57 -- common/autotest_common.sh@318 -- # df -T 00:07:08.080 23:53:57 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:07:08.080 23:53:57 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:07:08.080 23:53:57 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # avails["$mount"]=1052192768 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:07:08.080 23:53:57 -- common/autotest_common.sh@354 -- # uses["$mount"]=4232237056 00:07:08.080 23:53:57 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # avails["$mount"]=52984492032 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742297088 00:07:08.080 23:53:57 -- common/autotest_common.sh@354 -- # uses["$mount"]=8757805056 00:07:08.080 23:53:57 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # avails["$mount"]=30869889024 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871146496 00:07:08.080 23:53:57 -- common/autotest_common.sh@354 -- # uses["$mount"]=1257472 00:07:08.080 23:53:57 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342480896 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348461056 00:07:08.080 23:53:57 -- common/autotest_common.sh@354 -- # uses["$mount"]=5980160 00:07:08.080 23:53:57 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870966272 00:07:08.080 23:53:57 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871150592 00:07:08.080 23:53:57 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=184320 00:07:08.080 23:53:57 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:08.080 23:53:57 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:08.081 23:53:57 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:08.081 23:53:57 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:07:08.081 23:53:57 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:07:08.081 23:53:57 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:07:08.081 23:53:57 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:08.081 23:53:57 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:07:08.081 * Looking for test storage... 00:07:08.081 23:53:57 -- common/autotest_common.sh@359 -- # local target_space new_size 00:07:08.081 23:53:57 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:07:08.081 23:53:57 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:08.081 23:53:57 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:08.081 23:53:57 -- common/autotest_common.sh@363 -- # mount=/ 00:07:08.081 23:53:57 -- common/autotest_common.sh@365 -- # target_space=52984492032 00:07:08.081 23:53:57 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:07:08.081 23:53:57 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:07:08.081 23:53:57 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:07:08.081 23:53:57 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:07:08.081 23:53:57 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:07:08.081 23:53:57 -- common/autotest_common.sh@372 -- # new_size=10972397568 00:07:08.081 23:53:57 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:08.081 23:53:57 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:08.081 23:53:57 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:08.081 23:53:57 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:08.081 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:08.081 23:53:57 -- common/autotest_common.sh@380 -- # return 0 00:07:08.081 23:53:57 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:07:08.081 23:53:57 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:07:08.081 23:53:57 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:08.081 23:53:57 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:08.081 23:53:57 -- common/autotest_common.sh@1672 -- # true 00:07:08.081 23:53:57 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:07:08.081 23:53:57 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:08.081 23:53:57 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:08.081 23:53:57 -- common/autotest_common.sh@27 -- # exec 00:07:08.081 23:53:57 -- common/autotest_common.sh@29 -- # exec 00:07:08.081 23:53:57 -- common/autotest_common.sh@31 -- # 
xtrace_restore 00:07:08.081 23:53:57 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:08.081 23:53:57 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:08.081 23:53:57 -- common/autotest_common.sh@18 -- # set -x 00:07:08.081 23:53:57 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:08.081 23:53:57 -- ../common.sh@8 -- # pids=() 00:07:08.081 23:53:57 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:08.081 23:53:57 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:08.081 23:53:57 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:08.081 23:53:57 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:08.081 23:53:57 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:08.081 23:53:57 -- nvmf/run.sh@61 -- # mem_size=512 00:07:08.081 23:53:57 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:08.081 23:53:57 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:08.081 23:53:57 -- ../common.sh@69 -- # local fuzz_num=25 00:07:08.081 23:53:57 -- ../common.sh@70 -- # local time=1 00:07:08.081 23:53:57 -- ../common.sh@72 -- # (( i = 0 )) 00:07:08.081 23:53:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:08.081 23:53:57 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:08.081 23:53:57 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:08.081 23:53:57 -- nvmf/run.sh@24 -- # local timen=1 00:07:08.081 23:53:57 -- nvmf/run.sh@25 -- # local core=0x1 00:07:08.081 23:53:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:08.081 23:53:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:08.081 23:53:57 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:08.081 23:53:57 -- nvmf/run.sh@29 -- # port=4400 00:07:08.081 23:53:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:08.081 23:53:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:08.081 23:53:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:08.081 23:53:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:08.081 [2024-04-25 23:53:57.595204] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
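The llvm_nvme_fuzz command line traced above is parameterized entirely by the fuzzer index: printf %02d pads the index into the last two digits of the TCP listener port (4400 for fuzzer 0), the stock fuzz_json.conf is rewritten from the default trsvcid 4420 to that port, and a per-index corpus directory is created before launch. The fuzz_num=25 above comes from counting '.fn =' handler entries in llvm_nvme_fuzz.c, so the short pass repeats this setup once per admin-command handler. A minimal sketch of that parameterization, assuming $rootdir points at the spdk checkout (the variable names are illustrative, not the actual nvmf/run.sh source):

  fuzzer_type=0                          # index of the handler under test
  timen=1                                # -t: one second per target in short mode
  fuzz_num=$(grep -c '\.fn =' "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c")
  port=44$(printf %02d "$fuzzer_type")   # 4400, 4401, ... one listener per fuzzer
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  mkdir -p "$rootdir/../corpus/llvm_nvmf_$fuzzer_type"
  # repoint the JSON config from the default NVMe/TCP listener (4420) to this fuzzer's port
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$fuzzer_type.conf"
  "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$rootdir/../output/llvm/" -F "$trid" -c "/tmp/fuzz_json_$fuzzer_type.conf" \
      -t "$timen" -D "$rootdir/../corpus/llvm_nvmf_$fuzzer_type" \
      -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"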
00:07:08.081 [2024-04-25 23:53:57.595310] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470924 ] 00:07:08.081 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.339 [2024-04-25 23:53:57.860611] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.339 [2024-04-25 23:53:57.885918] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:08.339 [2024-04-25 23:53:57.886046] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.339 [2024-04-25 23:53:57.937577] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:08.598 [2024-04-25 23:53:57.953879] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:08.598 INFO: Running with entropic power schedule (0xFF, 100). 00:07:08.598 INFO: Seed: 756403730 00:07:08.598 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:08.598 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:08.598 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:08.598 INFO: A corpus is not provided, starting from an empty corpus 00:07:08.598 #2 INITED exec/s: 0 rss: 59Mb 00:07:08.598 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:08.598 This may also happen if the target rejected all inputs we tried so far 00:07:08.598 [2024-04-25 23:53:58.008994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:08.598 [2024-04-25 23:53:58.009022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.857 NEW_FUNC[1/661]: 0x49d5e0 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:08.857 NEW_FUNC[2/661]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:08.857 #16 NEW cov: 11431 ft: 11432 corp: 2/82b lim: 320 exec/s: 0 rss: 67Mb L: 81/81 MS: 4 CopyPart-CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:07:08.857 [2024-04-25 23:53:58.319838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:08.857 [2024-04-25 23:53:58.319871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.857 NEW_FUNC[1/1]: 0x17796b0 in nvme_tcp_qpair /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_tcp.c:171 00:07:08.857 #17 NEW cov: 11548 ft: 11858 corp: 3/163b lim: 320 exec/s: 0 rss: 67Mb L: 81/81 MS: 1 ChangeBit- 00:07:08.857 [2024-04-25 23:53:58.369903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:08.857 [2024-04-25 23:53:58.369932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.857 #18 NEW cov: 11554 ft: 12105 corp: 4/244b lim: 320 exec/s: 0 rss: 67Mb L: 81/81 MS: 1 ChangeBinInt- 00:07:08.857 [2024-04-25 23:53:58.409978] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:08.857 [2024-04-25 23:53:58.410005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.857 #19 NEW cov: 11639 ft: 12462 corp: 5/326b lim: 320 exec/s: 0 rss: 67Mb L: 82/82 MS: 1 InsertByte- 00:07:08.857 [2024-04-25 23:53:58.450146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:08.857 [2024-04-25 23:53:58.450171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.115 #20 NEW cov: 11639 ft: 12536 corp: 6/392b lim: 320 exec/s: 0 rss: 67Mb L: 66/82 MS: 1 EraseBytes- 00:07:09.115 [2024-04-25 23:53:58.480211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.115 [2024-04-25 23:53:58.480239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.115 #26 NEW cov: 11639 ft: 12711 corp: 7/474b lim: 320 exec/s: 0 rss: 67Mb L: 82/82 MS: 1 ChangeBinInt- 00:07:09.115 [2024-04-25 23:53:58.520385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.115 [2024-04-25 23:53:58.520414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.115 #27 NEW cov: 11639 ft: 12771 corp: 8/555b lim: 320 exec/s: 0 rss: 68Mb L: 81/82 MS: 1 ChangeByte- 00:07:09.115 [2024-04-25 23:53:58.560445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.115 [2024-04-25 23:53:58.560470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.115 #28 NEW cov: 11639 ft: 12799 corp: 9/637b lim: 320 exec/s: 0 rss: 68Mb L: 82/82 MS: 1 InsertByte- 00:07:09.115 [2024-04-25 23:53:58.590588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.115 [2024-04-25 23:53:58.590613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.115 #29 NEW cov: 11639 ft: 12868 corp: 10/718b lim: 320 exec/s: 0 rss: 68Mb L: 81/82 MS: 1 ChangeByte- 00:07:09.115 [2024-04-25 23:53:58.620621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.115 [2024-04-25 23:53:58.620645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.115 #30 NEW cov: 11639 ft: 12887 corp: 11/799b lim: 320 exec/s: 0 rss: 68Mb L: 81/82 MS: 1 ShuffleBytes- 00:07:09.115 [2024-04-25 23:53:58.660757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.115 [2024-04-25 23:53:58.660782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.115 #31 NEW cov: 11639 ft: 12982 corp: 12/881b lim: 320 exec/s: 0 rss: 68Mb L: 82/82 MS: 1 
ShuffleBytes- 00:07:09.115 [2024-04-25 23:53:58.690823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.115 [2024-04-25 23:53:58.690848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.115 #37 NEW cov: 11639 ft: 13035 corp: 13/962b lim: 320 exec/s: 0 rss: 68Mb L: 81/82 MS: 1 CrossOver- 00:07:09.374 [2024-04-25 23:53:58.730957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.374 [2024-04-25 23:53:58.730982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.374 #38 NEW cov: 11639 ft: 13087 corp: 14/1043b lim: 320 exec/s: 0 rss: 68Mb L: 81/82 MS: 1 ChangeByte- 00:07:09.374 [2024-04-25 23:53:58.761056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.374 [2024-04-25 23:53:58.761081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.374 #39 NEW cov: 11639 ft: 13118 corp: 15/1124b lim: 320 exec/s: 0 rss: 68Mb L: 81/82 MS: 1 ChangeByte- 00:07:09.374 [2024-04-25 23:53:58.801118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:ff5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.374 [2024-04-25 23:53:58.801143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.374 #40 NEW cov: 11639 ft: 13162 corp: 16/1213b lim: 320 exec/s: 0 rss: 68Mb L: 89/89 MS: 1 CMP- DE: "\377\377\377\377\376\377\377\377"- 00:07:09.374 [2024-04-25 23:53:58.841225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:79797979 cdw11:79797979 00:07:09.374 [2024-04-25 23:53:58.841250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.374 #41 NEW cov: 11639 ft: 13189 corp: 17/1336b lim: 320 exec/s: 0 rss: 68Mb L: 123/123 MS: 1 InsertRepeatedBytes- 00:07:09.374 [2024-04-25 23:53:58.881351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:ff5c5c5c cdw10:5c5c5c5c cdw11:5c635c01 00:07:09.374 [2024-04-25 23:53:58.881378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.374 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:09.374 #42 NEW cov: 11662 ft: 13315 corp: 18/1446b lim: 320 exec/s: 0 rss: 68Mb L: 110/123 MS: 1 CrossOver- 00:07:09.374 [2024-04-25 23:53:58.921457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (50) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.374 [2024-04-25 23:53:58.921483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.374 #45 NEW cov: 11662 ft: 13318 corp: 19/1514b lim: 320 exec/s: 0 rss: 68Mb L: 68/123 MS: 3 ChangeByte-InsertByte-CrossOver- 00:07:09.374 [2024-04-25 23:53:58.961607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c 
cdw11:5c5c5c5c 00:07:09.374 [2024-04-25 23:53:58.961632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.374 #46 NEW cov: 11662 ft: 13383 corp: 20/1595b lim: 320 exec/s: 0 rss: 68Mb L: 81/123 MS: 1 CMP- DE: "^y\352\332K\025w\000"- 00:07:09.632 [2024-04-25 23:53:58.991668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.632 [2024-04-25 23:53:58.991694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.632 #47 NEW cov: 11662 ft: 13399 corp: 21/1666b lim: 320 exec/s: 47 rss: 68Mb L: 71/123 MS: 1 EraseBytes- 00:07:09.632 [2024-04-25 23:53:59.021771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.632 [2024-04-25 23:53:59.021796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.632 #48 NEW cov: 11662 ft: 13483 corp: 22/1736b lim: 320 exec/s: 48 rss: 68Mb L: 70/123 MS: 1 EraseBytes- 00:07:09.632 [2024-04-25 23:53:59.061890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:79797979 cdw11:79797979 00:07:09.632 [2024-04-25 23:53:59.061915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.632 #49 NEW cov: 11662 ft: 13492 corp: 23/1859b lim: 320 exec/s: 49 rss: 69Mb L: 123/123 MS: 1 ShuffleBytes- 00:07:09.632 [2024-04-25 23:53:59.101996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.632 [2024-04-25 23:53:59.102022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.632 #50 NEW cov: 11662 ft: 13507 corp: 24/1948b lim: 320 exec/s: 50 rss: 69Mb L: 89/123 MS: 1 PersAutoDict- DE: "\377\377\377\377\376\377\377\377"- 00:07:09.632 [2024-04-25 23:53:59.132099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.632 [2024-04-25 23:53:59.132123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.632 #51 NEW cov: 11662 ft: 13565 corp: 25/2030b lim: 320 exec/s: 51 rss: 69Mb L: 82/123 MS: 1 InsertByte- 00:07:09.632 [2024-04-25 23:53:59.162199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (50) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.632 [2024-04-25 23:53:59.162227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.632 #52 NEW cov: 11662 ft: 13576 corp: 26/2099b lim: 320 exec/s: 52 rss: 69Mb L: 69/123 MS: 1 InsertByte- 00:07:09.632 [2024-04-25 23:53:59.202332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:795c5c5c cdw11:5c5c5c5c 00:07:09.632 [2024-04-25 23:53:59.202357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.632 #53 NEW cov: 11662 ft: 13627 corp: 27/2180b lim: 320 exec/s: 53 rss: 69Mb L: 81/123 MS: 1 CrossOver- 
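Each input that reaches the target is echoed twice by the qpair layer: nvme_qpair.c:225 prints the admin command the fuzzer submitted (opcode in parentheses, plus the nsid/cdw10/cdw11 payload words), and nvme_qpair.c:477 prints the resulting completion, which for these random opcodes is INVALID OPCODE (00/01). To tally which opcodes a run actually exercised, the notices can be pulled from a saved console log; fuzz0.log below is a hypothetical capture of this output, not a file the test writes:

  # count admin opcodes submitted, most frequent first
  grep -o 'ADMIN COMMAND ([0-9a-f]\{2\})' fuzz0.log | sort | uniq -c | sort -rn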
00:07:09.632 [2024-04-25 23:53:59.242471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.632 [2024-04-25 23:53:59.242497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.891 #54 NEW cov: 11662 ft: 13692 corp: 28/2261b lim: 320 exec/s: 54 rss: 69Mb L: 81/123 MS: 1 ChangeByte- 00:07:09.891 [2024-04-25 23:53:59.272578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.891 [2024-04-25 23:53:59.272603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.891 #55 NEW cov: 11662 ft: 13700 corp: 29/2346b lim: 320 exec/s: 55 rss: 69Mb L: 85/123 MS: 1 InsertRepeatedBytes- 00:07:09.891 [2024-04-25 23:53:59.312637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (50) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.891 [2024-04-25 23:53:59.312662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.891 #56 NEW cov: 11662 ft: 13710 corp: 30/2436b lim: 320 exec/s: 56 rss: 69Mb L: 90/123 MS: 1 InsertRepeatedBytes- 00:07:09.891 [2024-04-25 23:53:59.352750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.891 [2024-04-25 23:53:59.352774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.891 #57 NEW cov: 11662 ft: 13721 corp: 31/2519b lim: 320 exec/s: 57 rss: 69Mb L: 83/123 MS: 1 InsertByte- 00:07:09.891 [2024-04-25 23:53:59.392871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.891 [2024-04-25 23:53:59.392896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.891 #58 NEW cov: 11662 ft: 13728 corp: 32/2604b lim: 320 exec/s: 58 rss: 69Mb L: 85/123 MS: 1 CrossOver- 00:07:09.891 [2024-04-25 23:53:59.433016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.891 [2024-04-25 23:53:59.433042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.891 #59 NEW cov: 11662 ft: 13734 corp: 33/2685b lim: 320 exec/s: 59 rss: 69Mb L: 81/123 MS: 1 ChangeBit- 00:07:09.891 [2024-04-25 23:53:59.463071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:09.891 [2024-04-25 23:53:59.463097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.891 #60 NEW cov: 11662 ft: 13774 corp: 34/2766b lim: 320 exec/s: 60 rss: 69Mb L: 81/123 MS: 1 ChangeBinInt- 00:07:10.150 [2024-04-25 23:53:59.503200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:b0b0b0b0 cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:10.150 [2024-04-25 23:53:59.503225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
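The '#N NEW cov:' lines interleaved with these notices are standard libFuzzer progress output: cov counts covered coverage points, ft counts coverage features, corp gives the corpus size as units/bytes, lim is the current input-length cap (320 throughout this run), L is the new unit's length against the longest accepted so far, and MS names the mutation sequence that produced it (CrossOver, ChangeByte, PersAutoDict replays of auto-dictionary entries such as the DE: values above, and so on). A quick way to chart corpus growth from the same hypothetical fuzz0.log capture:

  # print coverage, corpus units/bytes and exec/s for every corpus addition
  awk '/ NEW cov: / {
      for (i = 1; i <= NF; i++) {
          if ($i == "cov:")    cov  = $(i + 1)
          if ($i == "corp:")   corp = $(i + 1)
          if ($i == "exec/s:") eps  = $(i + 1)
      }
      print cov, corp, eps
  }' fuzz0.log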
00:07:10.150 #61 NEW cov: 11662 ft: 13812 corp: 35/2848b lim: 320 exec/s: 61 rss: 70Mb L: 82/123 MS: 1 InsertRepeatedBytes- 00:07:10.150 [2024-04-25 23:53:59.543290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:10.150 [2024-04-25 23:53:59.543317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.150 #62 NEW cov: 11662 ft: 13838 corp: 36/2919b lim: 320 exec/s: 62 rss: 70Mb L: 71/123 MS: 1 ChangeBinInt- 00:07:10.150 [2024-04-25 23:53:59.583402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:ff5c5c5c cdw10:5c5c5c5c cdw11:5c635c01 00:07:10.150 [2024-04-25 23:53:59.583428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.150 #63 NEW cov: 11662 ft: 13841 corp: 37/3029b lim: 320 exec/s: 63 rss: 70Mb L: 110/123 MS: 1 ChangeByte- 00:07:10.150 [2024-04-25 23:53:59.623539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:10.150 [2024-04-25 23:53:59.623566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.150 #64 NEW cov: 11662 ft: 13851 corp: 38/3114b lim: 320 exec/s: 64 rss: 70Mb L: 85/123 MS: 1 ChangeBit- 00:07:10.150 [2024-04-25 23:53:59.663676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:10.150 [2024-04-25 23:53:59.663701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.150 #65 NEW cov: 11662 ft: 13864 corp: 39/3196b lim: 320 exec/s: 65 rss: 70Mb L: 82/123 MS: 1 InsertByte- 00:07:10.150 [2024-04-25 23:53:59.703804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:fffeffff cdw11:5c5cffff 00:07:10.150 [2024-04-25 23:53:59.703829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.150 #66 NEW cov: 11662 ft: 13868 corp: 40/3277b lim: 320 exec/s: 66 rss: 70Mb L: 81/123 MS: 1 PersAutoDict- DE: "\377\377\377\377\376\377\377\377"- 00:07:10.150 [2024-04-25 23:53:59.733867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:d8d8d8d8 cdw11:d8d8d8d8 00:07:10.150 [2024-04-25 23:53:59.733893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.150 #67 NEW cov: 11662 ft: 13878 corp: 41/3382b lim: 320 exec/s: 67 rss: 70Mb L: 105/123 MS: 1 InsertRepeatedBytes- 00:07:10.409 [2024-04-25 23:53:59.773974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:10.409 [2024-04-25 23:53:59.774000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.409 #68 NEW cov: 11662 ft: 13891 corp: 42/3471b lim: 320 exec/s: 68 rss: 70Mb L: 89/123 MS: 1 PersAutoDict- DE: "^y\352\332K\025w\000"- 00:07:10.409 [2024-04-25 23:53:59.814284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c 
cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:10.409 [2024-04-25 23:53:59.814310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.409 [2024-04-25 23:53:59.814367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (51) qid:0 cid:5 nsid:51515151 cdw10:51515151 cdw11:51515151 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.409 [2024-04-25 23:53:59.814381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.409 NEW_FUNC[1/2]: 0x16f5660 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159 00:07:10.409 NEW_FUNC[2/2]: 0x16f61c0 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:10.409 #74 NEW cov: 11695 ft: 14408 corp: 43/3605b lim: 320 exec/s: 74 rss: 70Mb L: 134/134 MS: 1 InsertRepeatedBytes- 00:07:10.409 [2024-04-25 23:53:59.854255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5cdc5c5c 00:07:10.409 [2024-04-25 23:53:59.854281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.409 #75 NEW cov: 11695 ft: 14464 corp: 44/3690b lim: 320 exec/s: 75 rss: 70Mb L: 85/134 MS: 1 CopyPart- 00:07:10.409 [2024-04-25 23:53:59.894377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:10.409 [2024-04-25 23:53:59.894408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.409 #76 NEW cov: 11695 ft: 14521 corp: 45/3771b lim: 320 exec/s: 76 rss: 70Mb L: 81/134 MS: 1 ChangeBinInt- 00:07:10.409 [2024-04-25 23:53:59.934654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:79797979 cdw11:79797979 00:07:10.409 [2024-04-25 23:53:59.934679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.409 [2024-04-25 23:53:59.934742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e7) qid:0 cid:5 nsid:e7e7e7e7 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe7e7e7e7e7e7e7e7 00:07:10.409 [2024-04-25 23:53:59.934756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.409 [2024-04-25 23:53:59.934817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e7) qid:0 cid:6 nsid:5c63ace7 cdw10:5c5c5c5c cdw11:5c5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x5c5c5c5c5c5c5c5c 00:07:10.409 [2024-04-25 23:53:59.934831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.409 NEW_FUNC[1/1]: 0x12f6060 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2014 00:07:10.409 #77 NEW cov: 11727 ft: 14708 corp: 46/3969b lim: 320 exec/s: 77 rss: 70Mb L: 198/198 MS: 1 InsertRepeatedBytes- 00:07:10.409 [2024-04-25 23:53:59.984681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (5c) qid:0 cid:4 nsid:5c5c5c5c cdw10:5c5c5c5c cdw11:5c5c5c5c 00:07:10.409 [2024-04-25 23:53:59.984706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.409 [2024-04-25 23:53:59.984770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e3) qid:0 cid:5 nsid:e3e3e3e3 cdw10:e3e3e3e3 cdw11:e3e3e3e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe3e3e3e3e3e3e3e3 00:07:10.409 [2024-04-25 23:53:59.984784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.409 #78 NEW cov: 11727 ft: 14758 corp: 47/4121b lim: 320 exec/s: 39 rss: 70Mb L: 152/198 MS: 1 InsertRepeatedBytes- 00:07:10.409 #78 DONE cov: 11727 ft: 14758 corp: 47/4121b lim: 320 exec/s: 39 rss: 70Mb 00:07:10.409 ###### Recommended dictionary. ###### 00:07:10.409 "\377\377\377\377\376\377\377\377" # Uses: 2 00:07:10.409 "^y\352\332K\025w\000" # Uses: 1 00:07:10.409 ###### End of recommended dictionary. ###### 00:07:10.409 Done 78 runs in 2 second(s) 00:07:10.668 23:54:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:10.668 23:54:00 -- ../common.sh@72 -- # (( i++ )) 00:07:10.668 23:54:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:10.668 23:54:00 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:10.668 23:54:00 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:10.668 23:54:00 -- nvmf/run.sh@24 -- # local timen=1 00:07:10.668 23:54:00 -- nvmf/run.sh@25 -- # local core=0x1 00:07:10.668 23:54:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:10.668 23:54:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:10.668 23:54:00 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:10.668 23:54:00 -- nvmf/run.sh@29 -- # port=4401 00:07:10.668 23:54:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:10.668 23:54:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:10.668 23:54:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:10.668 23:54:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:10.668 [2024-04-25 23:54:00.159658] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
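(The nvmf/run.sh trace above is worth unpacking before the fuzzer's startup output continues: it derived port 4401 from the fuzzer index via printf %02d, rewrote trsvcid in the JSON config with sed, and pointed the harness at a per-target corpus directory. Re-run by hand, the launch is the single command below; flags and paths are copied from the trace, only the line wrapping is added:

    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz \
        -m 0x1 -s 512 \
        -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ \
        -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' \
        -c /tmp/fuzz_json_1.conf -t 1 \
        -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 \
        -Z 1 -r /var/tmp/spdk1.sock

The "Recommended dictionary" block printed at the end of run #0 is also directly reusable: the entries are already in libFuzzer's dict-file syntax, so saving them to a file (nvmf_0.dict is a hypothetical name) gives

    # hypothetical nvmf_0.dict, built from the entries recommended above
    kw1="\377\377\377\377\376\377\377\377"
    kw2="^y\352\332K\025w\000"

which a bare libFuzzer target would consume via -dict=nvmf_0.dict. Whether this harness forwards -dict (or -seed=, to replay the "INFO: Seed:" value printed below) to libFuzzer is not shown in this log, so treat both as general libFuzzer notes rather than documented options of llvm_nvme_fuzz.)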
00:07:10.668 [2024-04-25 23:54:00.159747] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid471486 ] 00:07:10.668 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.926 [2024-04-25 23:54:00.343035] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.926 [2024-04-25 23:54:00.362439] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:10.926 [2024-04-25 23:54:00.362566] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.926 [2024-04-25 23:54:00.413997] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:10.926 [2024-04-25 23:54:00.430304] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:10.926 INFO: Running with entropic power schedule (0xFF, 100). 00:07:10.926 INFO: Seed: 3231421133 00:07:10.926 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:10.926 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:10.927 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:10.927 INFO: A corpus is not provided, starting from an empty corpus 00:07:10.927 #2 INITED exec/s: 0 rss: 59Mb 00:07:10.927 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:10.927 This may also happen if the target rejected all inputs we tried so far 00:07:10.927 [2024-04-25 23:54:00.496174] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:10.927 [2024-04-25 23:54:00.496330] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:10.927 [2024-04-25 23:54:00.496486] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:10.927 [2024-04-25 23:54:00.496619] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:10.927 [2024-04-25 23:54:00.496966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.927 [2024-04-25 23:54:00.497001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.927 [2024-04-25 23:54:00.497127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.927 [2024-04-25 23:54:00.497146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.927 [2024-04-25 23:54:00.497268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.927 [2024-04-25 23:54:00.497287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.927 [2024-04-25 23:54:00.497406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.927 [2024-04-25 23:54:00.497424] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.493 NEW_FUNC[1/664]: 0x49dee0 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:11.493 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:11.493 #10 NEW cov: 11531 ft: 11532 corp: 2/28b lim: 30 exec/s: 0 rss: 67Mb L: 27/27 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:07:11.493 [2024-04-25 23:54:00.836974] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.837146] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.837325] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.837480] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.837823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.837868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.837968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.837991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.838114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.838133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.838256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.838280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.493 #11 NEW cov: 11644 ft: 12221 corp: 3/57b lim: 30 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 CopyPart- 00:07:11.493 [2024-04-25 23:54:00.887114] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.887280] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.887441] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.887586] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.887895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.887929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.888046] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.888064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.888190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:18d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.888209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.888328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.888347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.493 #12 NEW cov: 11650 ft: 12459 corp: 4/85b lim: 30 exec/s: 0 rss: 67Mb L: 28/29 MS: 1 InsertByte- 00:07:11.493 [2024-04-25 23:54:00.927146] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.927308] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.927454] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000051d1 00:07:11.493 [2024-04-25 23:54:00.927599] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.927943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.927971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.928084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.928100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.928210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.928229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.928336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.928352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.493 #13 NEW cov: 11735 ft: 12713 corp: 5/112b lim: 30 exec/s: 0 rss: 67Mb L: 27/29 MS: 1 ChangeBit- 00:07:11.493 [2024-04-25 23:54:00.967288] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xd1d1 00:07:11.493 [2024-04-25 23:54:00.967434] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 
23:54:00.967577] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.967725] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:00.968084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d100d1 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.968115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.968233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.968253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.968372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.968390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:00.968502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:00.968520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.493 #14 NEW cov: 11735 ft: 12778 corp: 6/139b lim: 30 exec/s: 0 rss: 67Mb L: 27/29 MS: 1 ChangeBit- 00:07:11.493 [2024-04-25 23:54:01.007393] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xd1d1 00:07:11.493 [2024-04-25 23:54:01.007567] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:01.007707] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:01.008048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d100d1 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:01.008077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:01.008202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:01.008218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:01.008344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:01.008361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.493 #15 NEW cov: 11735 ft: 13353 corp: 7/160b lim: 30 exec/s: 0 rss: 67Mb L: 21/29 MS: 1 EraseBytes- 00:07:11.493 [2024-04-25 23:54:01.057542] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 
[2024-04-25 23:54:01.057708] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:01.057853] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:01.057993] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:01.058310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:01.058339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:01.058449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:01.058466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:01.058578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:01.058594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.493 [2024-04-25 23:54:01.058701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.493 [2024-04-25 23:54:01.058717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.493 #16 NEW cov: 11735 ft: 13450 corp: 8/188b lim: 30 exec/s: 0 rss: 67Mb L: 28/29 MS: 1 InsertByte- 00:07:11.493 [2024-04-25 23:54:01.097742] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:01.097903] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.493 [2024-04-25 23:54:01.098046] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d151 00:07:11.494 [2024-04-25 23:54:01.098190] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.494 [2024-04-25 23:54:01.098533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.494 [2024-04-25 23:54:01.098562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.494 [2024-04-25 23:54:01.098690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.494 [2024-04-25 23:54:01.098708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.494 [2024-04-25 23:54:01.098826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.494 [2024-04-25 23:54:01.098844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:07:11.494 [2024-04-25 23:54:01.098961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.494 [2024-04-25 23:54:01.098977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.752 #17 NEW cov: 11735 ft: 13543 corp: 9/216b lim: 30 exec/s: 0 rss: 67Mb L: 28/29 MS: 1 CopyPart- 00:07:11.752 [2024-04-25 23:54:01.147918] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.752 [2024-04-25 23:54:01.148082] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.752 [2024-04-25 23:54:01.148230] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000051d1 00:07:11.752 [2024-04-25 23:54:01.148382] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.752 [2024-04-25 23:54:01.148706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.752 [2024-04-25 23:54:01.148734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.752 [2024-04-25 23:54:01.148856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.752 [2024-04-25 23:54:01.148874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.752 [2024-04-25 23:54:01.148943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.752 [2024-04-25 23:54:01.148961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.752 [2024-04-25 23:54:01.149074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.752 [2024-04-25 23:54:01.149090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.752 #18 NEW cov: 11735 ft: 13588 corp: 10/241b lim: 30 exec/s: 0 rss: 67Mb L: 25/29 MS: 1 EraseBytes- 00:07:11.752 [2024-04-25 23:54:01.187853] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:11.752 [2024-04-25 23:54:01.188004] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:11.752 [2024-04-25 23:54:01.188157] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:11.752 [2024-04-25 23:54:01.188488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2dff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.752 [2024-04-25 23:54:01.188513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.752 [2024-04-25 23:54:01.188626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.752 
[2024-04-25 23:54:01.188644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.752 [2024-04-25 23:54:01.188759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.752 [2024-04-25 23:54:01.188777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.752 #23 NEW cov: 11735 ft: 13636 corp: 11/263b lim: 30 exec/s: 0 rss: 67Mb L: 22/29 MS: 5 ShuffleBytes-ShuffleBytes-InsertByte-EraseBytes-InsertRepeatedBytes- 00:07:11.752 [2024-04-25 23:54:01.228041] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300006b6b 00:07:11.753 [2024-04-25 23:54:01.228226] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300006b6b 00:07:11.753 [2024-04-25 23:54:01.228384] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300006b6b 00:07:11.753 [2024-04-25 23:54:01.228541] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300006b6b 00:07:11.753 [2024-04-25 23:54:01.228881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:6b6b836b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.228909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.753 [2024-04-25 23:54:01.229027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:6b6b836b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.229044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.753 [2024-04-25 23:54:01.229158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:6b6b836b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.229175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.753 [2024-04-25 23:54:01.229294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:6b6b836b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.229311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.753 #24 NEW cov: 11735 ft: 13716 corp: 12/288b lim: 30 exec/s: 0 rss: 67Mb L: 25/29 MS: 1 InsertRepeatedBytes- 00:07:11.753 [2024-04-25 23:54:01.268179] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.753 [2024-04-25 23:54:01.268346] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.753 [2024-04-25 23:54:01.268496] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000512a 00:07:11.753 [2024-04-25 23:54:01.268643] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.753 [2024-04-25 23:54:01.269010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.269037] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.753 [2024-04-25 23:54:01.269146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.269164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.753 [2024-04-25 23:54:01.269274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.269289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.753 [2024-04-25 23:54:01.269400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.269418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.753 #25 NEW cov: 11735 ft: 13760 corp: 13/314b lim: 30 exec/s: 0 rss: 68Mb L: 26/29 MS: 1 InsertByte- 00:07:11.753 [2024-04-25 23:54:01.318328] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.753 [2024-04-25 23:54:01.318497] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.753 [2024-04-25 23:54:01.318648] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000512a 00:07:11.753 [2024-04-25 23:54:01.318794] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:11.753 [2024-04-25 23:54:01.319142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:372e81d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.319169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.753 [2024-04-25 23:54:01.319286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.319303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.753 [2024-04-25 23:54:01.319412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.319430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.753 [2024-04-25 23:54:01.319544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.753 [2024-04-25 23:54:01.319562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.753 #26 NEW cov: 11735 ft: 13799 corp: 14/340b lim: 30 exec/s: 0 rss: 68Mb L: 26/29 MS: 1 ChangeBinInt- 00:07:12.011 [2024-04-25 23:54:01.368559] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0x10000d1d1 00:07:12.011 [2024-04-25 23:54:01.368715] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.011 [2024-04-25 23:54:01.368861] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.011 [2024-04-25 23:54:01.369017] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.369375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.369408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.369520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.369535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.369650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.369665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.369772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.369790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.012 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:12.012 #27 NEW cov: 11758 ft: 13854 corp: 15/367b lim: 30 exec/s: 0 rss: 68Mb L: 27/29 MS: 1 ShuffleBytes- 00:07:12.012 [2024-04-25 23:54:01.408603] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.408757] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.408910] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000512a 00:07:12.012 [2024-04-25 23:54:01.409054] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.409401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.409428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.409539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.409556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.409668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.409686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.409810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:27d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.409828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.012 #28 NEW cov: 11758 ft: 13872 corp: 16/393b lim: 30 exec/s: 0 rss: 68Mb L: 26/29 MS: 1 ChangeByte- 00:07:12.012 [2024-04-25 23:54:01.448688] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.448852] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.449006] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.449156] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.449521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.449549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.449658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.449674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.449786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:18d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.449804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.449917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d19181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.449934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.012 #29 NEW cov: 11758 ft: 13917 corp: 17/421b lim: 30 exec/s: 29 rss: 68Mb L: 28/29 MS: 1 ChangeBit- 00:07:12.012 [2024-04-25 23:54:01.488900] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.489070] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.489224] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000512a 00:07:12.012 [2024-04-25 23:54:01.489366] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xd1 00:07:12.012 [2024-04-25 23:54:01.489523] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d10b 00:07:12.012 [2024-04-25 23:54:01.489868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.489895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.490008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.490026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.490147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.490163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.490275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.490293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.490412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.490429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:12.012 #30 NEW cov: 11758 ft: 13969 corp: 18/451b lim: 30 exec/s: 30 rss: 68Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:12.012 [2024-04-25 23:54:01.528939] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.529114] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.529266] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.529421] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.529758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.529786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.529907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.529925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.530051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.530069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.530186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 
cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.530202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.012 #31 NEW cov: 11758 ft: 13972 corp: 19/475b lim: 30 exec/s: 31 rss: 68Mb L: 24/30 MS: 1 CrossOver- 00:07:12.012 [2024-04-25 23:54:01.569037] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.569197] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.569352] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d151 00:07:12.012 [2024-04-25 23:54:01.569513] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.569834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.569862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.569978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.569999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.570124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:4fd181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.570140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.570258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2ad181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.570279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.012 #32 NEW cov: 11758 ft: 13998 corp: 20/502b lim: 30 exec/s: 32 rss: 68Mb L: 27/30 MS: 1 InsertByte- 00:07:12.012 [2024-04-25 23:54:01.609212] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.609375] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.609529] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.609679] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.012 [2024-04-25 23:54:01.609993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.012 [2024-04-25 23:54:01.610020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.012 [2024-04-25 23:54:01.610137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181ba cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.013 [2024-04-25 
23:54:01.610153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.013 [2024-04-25 23:54:01.610264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:18d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.013 [2024-04-25 23:54:01.610281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.013 [2024-04-25 23:54:01.610390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.013 [2024-04-25 23:54:01.610411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.271 #33 NEW cov: 11758 ft: 14017 corp: 21/530b lim: 30 exec/s: 33 rss: 68Mb L: 28/30 MS: 1 ChangeByte- 00:07:12.271 [2024-04-25 23:54:01.649206] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:12.271 [2024-04-25 23:54:01.649376] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:12.272 [2024-04-25 23:54:01.649529] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:12.272 [2024-04-25 23:54:01.649877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2dff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.649903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.650015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.650032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.650145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.650162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.272 #34 NEW cov: 11758 ft: 14041 corp: 22/552b lim: 30 exec/s: 34 rss: 68Mb L: 22/30 MS: 1 ChangeByte- 00:07:12.272 [2024-04-25 23:54:01.699539] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.699700] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.699850] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.700004] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d151 00:07:12.272 [2024-04-25 23:54:01.700147] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.700486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.700514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.700627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.700645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.700761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.700777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.700886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.700903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.701025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.701042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:12.272 #35 NEW cov: 11758 ft: 14072 corp: 23/582b lim: 30 exec/s: 35 rss: 68Mb L: 30/30 MS: 1 CrossOver- 00:07:12.272 [2024-04-25 23:54:01.749551] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.749719] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.749861] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000512a 00:07:12.272 [2024-04-25 23:54:01.750004] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.750354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.750383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.750507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.750524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.750639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.750656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.750774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:752781d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.750791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.272 #36 NEW cov: 11758 ft: 14084 corp: 24/609b lim: 30 exec/s: 36 rss: 69Mb L: 27/30 MS: 1 InsertByte- 00:07:12.272 [2024-04-25 23:54:01.789598] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.789760] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.790073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.790099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.790221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.790238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.272 #37 NEW cov: 11758 ft: 14374 corp: 25/625b lim: 30 exec/s: 37 rss: 69Mb L: 16/30 MS: 1 EraseBytes- 00:07:12.272 [2024-04-25 23:54:01.829888] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.830042] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.830179] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.830321] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.830644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.830671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.830790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.830810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.830913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.830930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.831051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.831069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.272 #38 NEW cov: 11758 ft: 14388 corp: 26/649b lim: 30 exec/s: 38 rss: 69Mb L: 24/30 MS: 1 ShuffleBytes- 00:07:12.272 [2024-04-25 23:54:01.880104] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.880258] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.880415] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.880555] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.272 [2024-04-25 23:54:01.880904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.880931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.881049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.881068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.881190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.881206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.272 [2024-04-25 23:54:01.881323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.272 [2024-04-25 23:54:01.881341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.532 #39 NEW cov: 11758 ft: 14422 corp: 27/676b lim: 30 exec/s: 39 rss: 69Mb L: 27/30 MS: 1 ChangeByte- 00:07:12.532 [2024-04-25 23:54:01.920112] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:01.920282] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:01.920442] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:01.920597] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d10b 00:07:12.532 [2024-04-25 23:54:01.920959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:01.920986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:01.921103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d18118 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:01.921122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:01.921241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:01.921258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.532 
[2024-04-25 23:54:01.921374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:01.921392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.532 #40 NEW cov: 11758 ft: 14426 corp: 28/700b lim: 30 exec/s: 40 rss: 69Mb L: 24/30 MS: 1 CrossOver- 00:07:12.532 [2024-04-25 23:54:01.960248] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:01.960427] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:01.960581] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:01.960735] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:01.961090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:01.961118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:01.961231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:01.961248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:01.961357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:18d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:01.961373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:01.961489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d18191 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:01.961508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.532 #46 NEW cov: 11758 ft: 14441 corp: 29/729b lim: 30 exec/s: 46 rss: 69Mb L: 29/30 MS: 1 CrossOver- 00:07:12.532 [2024-04-25 23:54:02.000330] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.000492] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.000647] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.000794] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.001122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.001148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:02.001256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.001273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:02.001383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:18d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.001401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:02.001514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.001528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.532 #47 NEW cov: 11758 ft: 14462 corp: 30/756b lim: 30 exec/s: 47 rss: 69Mb L: 27/30 MS: 1 EraseBytes- 00:07:12.532 [2024-04-25 23:54:02.040400] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.040569] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.040725] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000512a 00:07:12.532 [2024-04-25 23:54:02.040882] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.041206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.041232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:02.041350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:83d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.041369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:02.041483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.041499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:02.041609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:752781d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.041626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.532 #48 NEW cov: 11758 ft: 14475 corp: 31/783b lim: 30 exec/s: 48 rss: 69Mb L: 27/30 MS: 1 ChangeByte- 00:07:12.532 [2024-04-25 23:54:02.090523] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.090690] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.091009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.091037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:02.091156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.091172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.532 #49 NEW cov: 11758 ft: 14508 corp: 32/798b lim: 30 exec/s: 49 rss: 69Mb L: 15/30 MS: 1 EraseBytes- 00:07:12.532 [2024-04-25 23:54:02.140868] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xd1d1 00:07:12.532 [2024-04-25 23:54:02.141044] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.141199] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000051d1 00:07:12.532 [2024-04-25 23:54:02.141356] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.532 [2024-04-25 23:54:02.141690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d100d1 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.141716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:02.141838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.141855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:02.141985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.142006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.532 [2024-04-25 23:54:02.142127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.532 [2024-04-25 23:54:02.142144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.792 #50 NEW cov: 11758 ft: 14531 corp: 33/825b lim: 30 exec/s: 50 rss: 69Mb L: 27/30 MS: 1 ChangeByte- 00:07:12.792 [2024-04-25 23:54:02.180837] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.181017] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.181157] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.181481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.181508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.181626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.181642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.181765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.181782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.792 #51 NEW cov: 11758 ft: 14547 corp: 34/848b lim: 30 exec/s: 51 rss: 69Mb L: 23/30 MS: 1 EraseBytes- 00:07:12.792 [2024-04-25 23:54:02.220993] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.221154] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.221297] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.221454] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.221775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.221801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.221921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.221939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.222009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:18d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.222025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.222140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.222158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.792 #52 NEW cov: 11758 ft: 14577 corp: 35/875b lim: 30 exec/s: 52 rss: 69Mb L: 27/30 MS: 1 ChangeBit- 00:07:12.792 [2024-04-25 23:54:02.271259] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.271427] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d9 00:07:12.792 [2024-04-25 23:54:02.271576] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000512a 00:07:12.792 [2024-04-25 23:54:02.271728] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.272087] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.272114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.272229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.272247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.272369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.272388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.272514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.272531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.792 #53 NEW cov: 11758 ft: 14585 corp: 36/901b lim: 30 exec/s: 53 rss: 70Mb L: 26/30 MS: 1 ChangeBinInt- 00:07:12.792 [2024-04-25 23:54:02.311022] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.311174] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.311525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.311553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.311663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.311681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.792 #54 NEW cov: 11758 ft: 14592 corp: 37/916b lim: 30 exec/s: 54 rss: 70Mb L: 15/30 MS: 1 ShuffleBytes- 00:07:12.792 [2024-04-25 23:54:02.351382] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.351545] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.351691] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000051d1 00:07:12.792 [2024-04-25 23:54:02.351841] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.792 [2024-04-25 23:54:02.352169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.352196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 
23:54:02.352320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.352338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.352463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.352480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.792 [2024-04-25 23:54:02.352604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.792 [2024-04-25 23:54:02.352622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.793 #55 NEW cov: 11758 ft: 14666 corp: 38/941b lim: 30 exec/s: 55 rss: 70Mb L: 25/30 MS: 1 ShuffleBytes- 00:07:12.793 [2024-04-25 23:54:02.391503] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.793 [2024-04-25 23:54:02.391653] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (477000) > buf size (4096) 00:07:12.793 [2024-04-25 23:54:02.391928] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:12.793 [2024-04-25 23:54:02.392242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.793 [2024-04-25 23:54:02.392268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.793 [2024-04-25 23:54:02.392397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181ba cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.793 [2024-04-25 23:54:02.392415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.793 [2024-04-25 23:54:02.392533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.793 [2024-04-25 23:54:02.392553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.793 [2024-04-25 23:54:02.392669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.793 [2024-04-25 23:54:02.392688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.052 #56 NEW cov: 11798 ft: 14716 corp: 39/969b lim: 30 exec/s: 56 rss: 70Mb L: 28/30 MS: 1 ChangeBinInt- 00:07:13.052 [2024-04-25 23:54:02.441786] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xd1d1 00:07:13.052 [2024-04-25 23:54:02.441944] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:13.052 [2024-04-25 23:54:02.442082] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000051d1 
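Each '#N NEW' record in this stream is a standard libFuzzer status line: cov is the number of code blocks or edges covered, ft the number of features (libFuzzer's finer-grained coverage signals), corp the corpus size in entries/bytes, lim the current input-length cap, exec/s the execution rate, rss resident memory, L the size of the input just added, and MS the mutation sequence (CrossOver, EraseBytes, ChangeBit, ...) that produced it. A one-line sketch for pulling these records out of a saved copy of this console output (build.log is a placeholder filename, not a file this job writes):

  grep -oE '#[0-9]+ (NEW|DONE) cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b lim: [0-9]+ exec/s: [0-9]+ rss: [0-9]+Mb' build.log

INITED records carry no coverage fields and deliberately fall outside the pattern.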
00:07:13.052 [2024-04-25 23:54:02.442223] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:13.052 [2024-04-25 23:54:02.442578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d100d1 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.052 [2024-04-25 23:54:02.442605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.053 [2024-04-25 23:54:02.442725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.053 [2024-04-25 23:54:02.442743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.053 [2024-04-25 23:54:02.442865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.053 [2024-04-25 23:54:02.442883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.053 [2024-04-25 23:54:02.443000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.053 [2024-04-25 23:54:02.443017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.053 #57 NEW cov: 11798 ft: 14724 corp: 40/996b lim: 30 exec/s: 57 rss: 70Mb L: 27/30 MS: 1 ShuffleBytes- 00:07:13.053 [2024-04-25 23:54:02.491682] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1d1 00:07:13.053 [2024-04-25 23:54:02.491839] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d10b 00:07:13.053 [2024-04-25 23:54:02.492161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.053 [2024-04-25 23:54:02.492188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.053 [2024-04-25 23:54:02.492312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d1d181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.053 [2024-04-25 23:54:02.492330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.053 #58 NEW cov: 11798 ft: 14731 corp: 41/1008b lim: 30 exec/s: 29 rss: 70Mb L: 12/30 MS: 1 EraseBytes- 00:07:13.053 #58 DONE cov: 11798 ft: 14731 corp: 41/1008b lim: 30 exec/s: 29 rss: 70Mb 00:07:13.053 Done 58 runs in 2 second(s) 00:07:13.053 23:54:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:13.053 23:54:02 -- ../common.sh@72 -- # (( i++ )) 00:07:13.053 23:54:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:13.053 23:54:02 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:13.053 23:54:02 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:13.053 23:54:02 -- nvmf/run.sh@24 -- # local timen=1 00:07:13.053 23:54:02 -- nvmf/run.sh@25 -- # local core=0x1 00:07:13.053 23:54:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:13.053 
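The nvmf/run.sh trace above and below records the teardown of run 1 and the launch of fuzzer instance 2 against its own TCP listener. Condensed into a standalone sketch (paths shortened; the flag glosses and the redirect on the sed step are assumptions read off the trace, not SPDK documentation):

  port=44$(printf %02d 2)           # fuzzer number N maps to TCP port 44NN, here 4402
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  mkdir -p ../corpus/llvm_nvmf_2    # per-instance corpus directory
  # rewrite the stock NVMe-oF config to the per-instance port; the trace does not
  # show where sed's output goes, so writing it to the instance config is assumed
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      test/fuzz/llvm/nvmf/fuzz_json.conf > /tmp/fuzz_json_2.conf
  # -m core mask and -s hugepage memory in MB are the usual SPDK app options;
  # -P output dir, -F target trid, -c subsystem config, -t seconds to run,
  # -D corpus dir, -Z fuzzer number, -r RPC socket are glosses inferred from the trace
  ./llvm_nvme_fuzz -m 0x1 -s 512 -P ./output/llvm/ -F "$trid" \
      -c /tmp/fuzz_json_2.conf -t 1 -D ../corpus/llvm_nvmf_2 \
      -Z 2 -r /var/tmp/spdk2.sock

Giving every instance its own trsvcid, config file, and corpus directory is what lets the numbered fuzzers run back to back without colliding on the listener or on each other's findings.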
23:54:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:13.053 23:54:02 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:13.053 23:54:02 -- nvmf/run.sh@29 -- # port=4402 00:07:13.053 23:54:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:13.053 23:54:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:13.053 23:54:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:13.053 23:54:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:13.311 [2024-04-25 23:54:02.674939] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:13.311 [2024-04-25 23:54:02.675022] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid471905 ] 00:07:13.311 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.568 [2024-04-25 23:54:02.934454] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.568 [2024-04-25 23:54:02.962153] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:13.568 [2024-04-25 23:54:02.962286] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.568 [2024-04-25 23:54:03.013915] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:13.568 [2024-04-25 23:54:03.030214] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:13.568 INFO: Running with entropic power schedule (0xFF, 100). 00:07:13.568 INFO: Seed: 1535433839 00:07:13.568 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:13.568 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:13.568 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:13.568 INFO: A corpus is not provided, starting from an empty corpus 00:07:13.568 #2 INITED exec/s: 0 rss: 59Mb 00:07:13.568 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:13.568 This may also happen if the target rejected all inputs we tried so far 00:07:13.568 [2024-04-25 23:54:03.101245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.568 [2024-04-25 23:54:03.101283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.568 [2024-04-25 23:54:03.101363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.568 [2024-04-25 23:54:03.101379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.569 [2024-04-25 23:54:03.101462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.569 [2024-04-25 23:54:03.101479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.826 NEW_FUNC[1/663]: 0x4a0900 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:13.826 NEW_FUNC[2/663]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:13.826 #3 NEW cov: 11489 ft: 11480 corp: 2/25b lim: 35 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:07:14.085 [2024-04-25 23:54:03.441226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.441273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.085 [2024-04-25 23:54:03.441401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.441423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.085 [2024-04-25 23:54:03.441540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.441561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.085 [2024-04-25 23:54:03.441686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.441707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.085 #4 NEW cov: 11602 ft: 12686 corp: 3/58b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:14.085 [2024-04-25 23:54:03.491261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.491288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.085 [2024-04-25 23:54:03.491429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.491448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.085 [2024-04-25 23:54:03.491569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.491586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.085 [2024-04-25 23:54:03.491703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.491720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.085 #10 NEW cov: 11608 ft: 13041 corp: 4/91b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 ShuffleBytes- 00:07:14.085 [2024-04-25 23:54:03.541432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.541459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.085 [2024-04-25 23:54:03.541578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.541595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.085 [2024-04-25 23:54:03.541685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.541701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.085 [2024-04-25 23:54:03.541818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:57003457 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.541833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.085 #11 NEW cov: 11693 ft: 13255 corp: 5/125b lim: 35 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 CopyPart- 00:07:14.085 [2024-04-25 23:54:03.591361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:32003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.591400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.085 [2024-04-25 23:54:03.591530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:32320032 cdw11:32003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.591546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:14.085 [2024-04-25 23:54:03.591670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:32003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.085 [2024-04-25 23:54:03.591688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.086 #12 NEW cov: 11693 ft: 13398 corp: 6/149b lim: 35 exec/s: 0 rss: 67Mb L: 24/34 MS: 1 ChangeASCIIInt- 00:07:14.086 [2024-04-25 23:54:03.631455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:32003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.086 [2024-04-25 23:54:03.631481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.086 [2024-04-25 23:54:03.631601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:32320032 cdw11:32003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.086 [2024-04-25 23:54:03.631617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.086 [2024-04-25 23:54:03.631739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:32003632 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.086 [2024-04-25 23:54:03.631754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.086 #13 NEW cov: 11693 ft: 13463 corp: 7/173b lim: 35 exec/s: 0 rss: 67Mb L: 24/34 MS: 1 ChangeBit- 00:07:14.086 [2024-04-25 23:54:03.681869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.086 [2024-04-25 23:54:03.681895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.086 [2024-04-25 23:54:03.682028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:32003432 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.086 [2024-04-25 23:54:03.682045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.086 [2024-04-25 23:54:03.682159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:32003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.086 [2024-04-25 23:54:03.682176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.086 [2024-04-25 23:54:03.682286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:32320032 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.086 [2024-04-25 23:54:03.682303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.344 #14 NEW cov: 11693 ft: 13549 corp: 8/206b lim: 35 exec/s: 0 rss: 67Mb L: 33/34 MS: 1 ChangeASCIIInt- 00:07:14.344 [2024-04-25 23:54:03.721298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0005000a cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.344 [2024-04-25 23:54:03.721325] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.344 #23 NEW cov: 11693 ft: 14060 corp: 9/214b lim: 35 exec/s: 0 rss: 68Mb L: 8/34 MS: 4 CMP-ChangeBinInt-ShuffleBytes-InsertRepeatedBytes- DE: "\000\000\000\000"- 00:07:14.344 [2024-04-25 23:54:03.762105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.344 [2024-04-25 23:54:03.762131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.344 [2024-04-25 23:54:03.762269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:32003432 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.344 [2024-04-25 23:54:03.762286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.344 [2024-04-25 23:54:03.762402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:32003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.344 [2024-04-25 23:54:03.762418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.344 [2024-04-25 23:54:03.762536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:32320032 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.344 [2024-04-25 23:54:03.762554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.344 #24 NEW cov: 11693 ft: 14095 corp: 10/247b lim: 35 exec/s: 0 rss: 68Mb L: 33/34 MS: 1 ShuffleBytes- 00:07:14.344 [2024-04-25 23:54:03.812224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:e8e8000a cdw11:3400e8e8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.344 [2024-04-25 23:54:03.812253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.344 [2024-04-25 23:54:03.812368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.344 [2024-04-25 23:54:03.812386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.344 [2024-04-25 23:54:03.812512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.344 [2024-04-25 23:54:03.812527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.344 [2024-04-25 23:54:03.812644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.344 [2024-04-25 23:54:03.812660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.344 #25 NEW cov: 11693 ft: 14147 corp: 11/275b lim: 35 exec/s: 0 rss: 68Mb L: 28/34 MS: 1 InsertRepeatedBytes- 00:07:14.345 [2024-04-25 23:54:03.851661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:32003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.345 [2024-04-25 23:54:03.851688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.345 #26 NEW cov: 11693 ft: 14171 corp: 12/288b lim: 35 exec/s: 0 rss: 68Mb L: 13/34 MS: 1 EraseBytes- 00:07:14.345 [2024-04-25 23:54:03.891967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.345 [2024-04-25 23:54:03.891995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.345 [2024-04-25 23:54:03.892115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.345 [2024-04-25 23:54:03.892131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.345 #27 NEW cov: 11693 ft: 14369 corp: 13/307b lim: 35 exec/s: 0 rss: 68Mb L: 19/34 MS: 1 EraseBytes- 00:07:14.345 [2024-04-25 23:54:03.932532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.345 [2024-04-25 23:54:03.932557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.345 [2024-04-25 23:54:03.932682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.345 [2024-04-25 23:54:03.932699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.345 [2024-04-25 23:54:03.932814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.345 [2024-04-25 23:54:03.932831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.345 [2024-04-25 23:54:03.932948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:57570057 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.345 [2024-04-25 23:54:03.932965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.345 #28 NEW cov: 11693 ft: 14421 corp: 14/336b lim: 35 exec/s: 0 rss: 68Mb L: 29/34 MS: 1 EraseBytes- 00:07:14.603 [2024-04-25 23:54:03.972743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:03.972772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:03.972889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:03.972905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.603 
[2024-04-25 23:54:03.973026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:2e003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:03.973043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:03.973152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:57003457 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:03.973171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.603 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:14.603 #29 NEW cov: 11716 ft: 14501 corp: 15/370b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ChangeBinInt- 00:07:14.603 [2024-04-25 23:54:04.022817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.022842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.022962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:32003432 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.022979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.023101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:00002100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.023116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.023232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:32320032 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.023250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.603 #30 NEW cov: 11716 ft: 14513 corp: 16/403b lim: 35 exec/s: 0 rss: 68Mb L: 33/34 MS: 1 ChangeBinInt- 00:07:14.603 [2024-04-25 23:54:04.062981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.063008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.063140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:32003432 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.063157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.063278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:00002100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.063294] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.063419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:32320032 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.063439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.603 #31 NEW cov: 11716 ft: 14569 corp: 17/436b lim: 35 exec/s: 31 rss: 68Mb L: 33/34 MS: 1 ShuffleBytes- 00:07:14.603 [2024-04-25 23:54:04.103144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.103171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.103301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.103318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.103443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:2e003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.103461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.103574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:00003400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.103591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.603 #32 NEW cov: 11716 ft: 14612 corp: 18/470b lim: 35 exec/s: 32 rss: 68Mb L: 34/34 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:14.603 [2024-04-25 23:54:04.153243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.153271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.153400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.153418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.153538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:cccb0034 cdw11:d100cbcb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.153555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.153678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:c33400cb cdw11:57003457 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.153692] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.603 #33 NEW cov: 11716 ft: 14627 corp: 19/504b lim: 35 exec/s: 33 rss: 68Mb L: 34/34 MS: 1 ChangeBinInt- 00:07:14.603 [2024-04-25 23:54:04.193127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:32003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.193154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.193272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:32320032 cdw11:32003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.193291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.603 [2024-04-25 23:54:04.193421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:32003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.603 [2024-04-25 23:54:04.193442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.603 #34 NEW cov: 11716 ft: 14646 corp: 20/528b lim: 35 exec/s: 34 rss: 68Mb L: 24/34 MS: 1 ChangeBinInt- 00:07:14.862 [2024-04-25 23:54:04.233507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.862 [2024-04-25 23:54:04.233536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.862 [2024-04-25 23:54:04.233651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:32003432 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.862 [2024-04-25 23:54:04.233669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.862 [2024-04-25 23:54:04.233780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:32003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.862 [2024-04-25 23:54:04.233798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.862 [2024-04-25 23:54:04.233917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:32320032 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.862 [2024-04-25 23:54:04.233934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.862 #35 NEW cov: 11716 ft: 14652 corp: 21/561b lim: 35 exec/s: 35 rss: 68Mb L: 33/34 MS: 1 ChangeBit- 00:07:14.862 [2024-04-25 23:54:04.273619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.862 [2024-04-25 23:54:04.273645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.862 [2024-04-25 23:54:04.273772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:32003432 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.862 [2024-04-25 23:54:04.273791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.862 [2024-04-25 23:54:04.273904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:3232005c cdw11:00002100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.862 [2024-04-25 23:54:04.273922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.862 [2024-04-25 23:54:04.274040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:32320032 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.862 [2024-04-25 23:54:04.274059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.862 #36 NEW cov: 11716 ft: 14665 corp: 22/594b lim: 35 exec/s: 36 rss: 69Mb L: 33/34 MS: 1 ChangeByte- 00:07:14.862 [2024-04-25 23:54:04.323770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.862 [2024-04-25 23:54:04.323797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.862 [2024-04-25 23:54:04.323915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.323932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.863 [2024-04-25 23:54:04.324055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:2e003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.324076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.863 [2024-04-25 23:54:04.324193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:57003357 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.324210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.863 #37 NEW cov: 11716 ft: 14668 corp: 23/628b lim: 35 exec/s: 37 rss: 69Mb L: 34/34 MS: 1 ChangeASCIIInt- 00:07:14.863 [2024-04-25 23:54:04.363405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:e8e8000a cdw11:3400e8e8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.363431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.863 [2024-04-25 23:54:04.363553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.363569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.863 [2024-04-25 23:54:04.403722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:e8e8000a cdw11:34003434 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.403749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.863 [2024-04-25 23:54:04.403872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:343400e8 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.403889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.863 [2024-04-25 23:54:04.404006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.404022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.863 #39 NEW cov: 11716 ft: 14682 corp: 24/649b lim: 35 exec/s: 39 rss: 69Mb L: 21/34 MS: 2 EraseBytes-CrossOver- 00:07:14.863 [2024-04-25 23:54:04.444141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:e8e8000a cdw11:3400e8e8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.444169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.863 [2024-04-25 23:54:04.444291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:30300034 cdw11:30003030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.444308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.863 [2024-04-25 23:54:04.444435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:30300030 cdw11:32003030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.444453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.863 [2024-04-25 23:54:04.444586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:33370038 cdw11:30003133 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.863 [2024-04-25 23:54:04.444602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.863 #40 NEW cov: 11716 ft: 14690 corp: 25/677b lim: 35 exec/s: 40 rss: 69Mb L: 28/34 MS: 1 ChangeASCIIInt- 00:07:15.122 [2024-04-25 23:54:04.483752] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:15.122 [2024-04-25 23:54:04.484253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.122 [2024-04-25 23:54:04.484283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.122 [2024-04-25 23:54:04.484401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:00000300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.122 [2024-04-25 23:54:04.484420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.122 [2024-04-25 23:54:04.484545] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:2e000034 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.122 [2024-04-25 23:54:04.484567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.122 [2024-04-25 23:54:04.484684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:57003457 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.122 [2024-04-25 23:54:04.484701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.122 #41 NEW cov: 11725 ft: 14736 corp: 26/711b lim: 35 exec/s: 41 rss: 69Mb L: 34/34 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:07:15.122 [2024-04-25 23:54:04.524150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.122 [2024-04-25 23:54:04.524178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.122 [2024-04-25 23:54:04.524301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34000300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.122 [2024-04-25 23:54:04.524318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.122 [2024-04-25 23:54:04.524443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:57570057 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.122 [2024-04-25 23:54:04.524460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.122 #42 NEW cov: 11725 ft: 14745 corp: 27/734b lim: 35 exec/s: 42 rss: 69Mb L: 23/34 MS: 1 EraseBytes- 00:07:15.122 [2024-04-25 23:54:04.574737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:e8e8000a cdw11:3400e8e8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.574764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.574870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.574887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.575004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.575021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.575131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:7c7c007c cdw11:7c007c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.575147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 
23:54:04.575261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.575280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.123 #43 NEW cov: 11725 ft: 14809 corp: 28/769b lim: 35 exec/s: 43 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:15.123 [2024-04-25 23:54:04.614799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.614827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.614930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.614949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.615068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34340034 cdw11:2e003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.615083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.615193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:57003457 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.615209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.615320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:57570057 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.615340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.123 #44 NEW cov: 11725 ft: 14839 corp: 29/804b lim: 35 exec/s: 44 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:07:15.123 [2024-04-25 23:54:04.654635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.654662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.654788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:32003432 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.654805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.654918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:32003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.654934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.655014] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:32320032 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.655029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.123 #45 NEW cov: 11725 ft: 14854 corp: 30/838b lim: 35 exec/s: 45 rss: 69Mb L: 34/35 MS: 1 CrossOver- 00:07:15.123 [2024-04-25 23:54:04.694288] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:15.123 [2024-04-25 23:54:04.694797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.694824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.694945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:00000300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.694964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.695079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:57000034 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.695102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.123 [2024-04-25 23:54:04.695218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:57570057 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.123 [2024-04-25 23:54:04.695234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.123 #46 NEW cov: 11725 ft: 14886 corp: 31/866b lim: 35 exec/s: 46 rss: 69Mb L: 28/35 MS: 1 EraseBytes- 00:07:15.383 [2024-04-25 23:54:04.734516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:32003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.734544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.383 [2024-04-25 23:54:04.734664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:32320032 cdw11:1f003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.734696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.383 #47 NEW cov: 11725 ft: 14924 corp: 32/883b lim: 35 exec/s: 47 rss: 69Mb L: 17/35 MS: 1 CMP- DE: "\037\000\000\000"- 00:07:15.383 [2024-04-25 23:54:04.775207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:30003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.775232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.383 [2024-04-25 23:54:04.775351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:30300030 
cdw11:32003730 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.775368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.383 [2024-04-25 23:54:04.775488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37330035 cdw11:2e003838 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.775506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.383 [2024-04-25 23:54:04.775623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:57003457 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.775639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.383 [2024-04-25 23:54:04.775759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:57570057 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.775775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.383 #48 NEW cov: 11725 ft: 14937 corp: 33/918b lim: 35 exec/s: 48 rss: 69Mb L: 35/35 MS: 1 ChangeASCIIInt- 00:07:15.383 [2024-04-25 23:54:04.825402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.825430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.383 [2024-04-25 23:54:04.825548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:cb0034cb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.825570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.383 [2024-04-25 23:54:04.825680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:cbcb00cb cdw11:2e00cbcb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.825698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.383 [2024-04-25 23:54:04.825812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:34340034 cdw11:57003457 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.825828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.383 [2024-04-25 23:54:04.825949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:57570057 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.383 [2024-04-25 23:54:04.825965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.384 #49 NEW cov: 11725 ft: 14953 corp: 34/953b lim: 35 exec/s: 49 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:15.384 [2024-04-25 23:54:04.864760] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:15.384 [2024-04-25 
23:54:04.865145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:32003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.865177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.384 [2024-04-25 23:54:04.865296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:32320032 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.865314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.384 [2024-04-25 23:54:04.865427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:32000032 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.865448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.384 #50 NEW cov: 11725 ft: 14975 corp: 35/974b lim: 35 exec/s: 50 rss: 69Mb L: 21/35 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:15.384 [2024-04-25 23:54:04.905211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.905236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.384 [2024-04-25 23:54:04.905374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:3400340a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.905391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.384 [2024-04-25 23:54:04.905518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:34320034 cdw11:32003232 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.905535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.384 #51 NEW cov: 11725 ft: 14982 corp: 36/996b lim: 35 exec/s: 51 rss: 69Mb L: 22/35 MS: 1 CrossOver- 00:07:15.384 [2024-04-25 23:54:04.945444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:91003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.945469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.384 [2024-04-25 23:54:04.945601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:32003432 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.945617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.384 [2024-04-25 23:54:04.945741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:00002100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.945759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
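Each fuzz case above is reported as a fixed-shape pair: nvme_admin_qpair_print_command prints the admin command (name, opcode, qid, cid, cdw10/cdw11) and spdk_nvme_print_completion prints the matching completion (status string plus the sct/sc codes). A minimal Python sketch for pulling those fields out of a saved copy of this log, assuming only the format visible here — the regexes and the classify() helper are illustrative, not part of the SPDK tooling:

    import re

    # Command notice, e.g. "IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a ..."
    CMD_RE = re.compile(r'\*NOTICE\*: (?P<name>[A-Z -]+) \((?P<opc>[0-9a-f]{2})\) '
                        r'qid:(?P<qid>\d+) cid:(?P<cid>\d+)')
    # Completion notice, e.g. "INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f ..."
    CPL_RE = re.compile(r'\*NOTICE\*: (?P<status>[A-Z -]+) '
                        r'\((?P<sct>[0-9a-f]{2})/(?P<sc>[0-9a-f]{2})\) '
                        r'qid:(?P<qid>\d+) cid:(?P<cid>\d+)')

    def classify(line):
        """Return ('cpl', fields) or ('cmd', fields) for a matching notice, else None."""
        m = CPL_RE.search(line)   # try completions first: their "(xx/yy)" status code
        if m:                     # is stricter than the command's "(xx)" opcode
            return ('cpl', m.groupdict())
        m = CMD_RE.search(line)
        return ('cmd', m.groupdict()) if m else None

Run over the notices in this section, classify() yields e.g. ('cmd', {'name': 'IDENTIFY', 'opc': '06', 'qid': '0', 'cid': '4'}) for the command half and ('cpl', {'status': 'INVALID FIELD', 'sct': '00', 'sc': '02', ...}) for its completion.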
00:07:15.384 [2024-04-25 23:54:04.945856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:32320032 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.945873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.384 #52 NEW cov: 11725 ft: 14991 corp: 37/1029b lim: 35 exec/s: 52 rss: 69Mb L: 33/35 MS: 1 ChangeByte- 00:07:15.384 [2024-04-25 23:54:04.985254] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:15.384 [2024-04-25 23:54:04.985637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3458000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.985665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.384 [2024-04-25 23:54:04.985782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:32003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.985798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.384 [2024-04-25 23:54:04.985923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:32320032 cdw11:00003221 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.985940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.384 [2024-04-25 23:54:04.986058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:32320000 cdw11:57003257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.384 [2024-04-25 23:54:04.986080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.644 #53 NEW cov: 11725 ft: 14994 corp: 38/1063b lim: 35 exec/s: 53 rss: 69Mb L: 34/35 MS: 1 InsertByte- 00:07:15.644 [2024-04-25 23:54:05.025514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:3434000a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.644 [2024-04-25 23:54:05.025541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.644 [2024-04-25 23:54:05.025662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34000300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.644 [2024-04-25 23:54:05.025681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.644 [2024-04-25 23:54:05.025792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:57570057 cdw11:57005757 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.644 [2024-04-25 23:54:05.025810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.644 #54 NEW cov: 11725 ft: 15006 corp: 39/1086b lim: 35 exec/s: 54 rss: 69Mb L: 23/35 MS: 1 ChangeBit- 00:07:15.644 [2024-04-25 23:54:05.065367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 
cdw10:0a34008a cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.644 [2024-04-25 23:54:05.065399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.644 [2024-04-25 23:54:05.065530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:34340034 cdw11:34003434 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.644 [2024-04-25 23:54:05.065547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.644 #57 NEW cov: 11725 ft: 15014 corp: 40/1100b lim: 35 exec/s: 28 rss: 69Mb L: 14/35 MS: 3 ShuffleBytes-ChangeBit-CrossOver- 00:07:15.644 #57 DONE cov: 11725 ft: 15014 corp: 40/1100b lim: 35 exec/s: 28 rss: 69Mb 00:07:15.644 ###### Recommended dictionary. ###### 00:07:15.644 "\000\000\000\000" # Uses: 1 00:07:15.644 "\003\000\000\000\000\000\000\000" # Uses: 0 00:07:15.644 "\037\000\000\000" # Uses: 0 00:07:15.644 "\000\000\000\000\000\000\000\000" # Uses: 0 00:07:15.644 ###### End of recommended dictionary. ###### 00:07:15.644 Done 57 runs in 2 second(s) 00:07:15.644 23:54:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:15.644 23:54:05 -- ../common.sh@72 -- # (( i++ )) 00:07:15.644 23:54:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:15.644 23:54:05 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:15.644 23:54:05 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:15.644 23:54:05 -- nvmf/run.sh@24 -- # local timen=1 00:07:15.644 23:54:05 -- nvmf/run.sh@25 -- # local core=0x1 00:07:15.644 23:54:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:15.644 23:54:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:15.644 23:54:05 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:15.644 23:54:05 -- nvmf/run.sh@29 -- # port=4403 00:07:15.644 23:54:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:15.644 23:54:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:15.644 23:54:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:15.644 23:54:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:15.644 [2024-04-25 23:54:05.243676] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
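The epilogue above shows the per-run plumbing in nvmf/run.sh: the fuzzer index picks a TCP port (printf %02d 3 gives 4403), sed rewrites the trsvcid in fuzz_json.conf into a per-run copy under /tmp, and llvm_nvme_fuzz is launched with the matching transport ID via -F. A rough Python equivalent of that setup step, assuming only what the trace shows (file names shortened; the actual script operates on the full repository paths):

    # Derive the per-run port and config, mirroring the traced shell steps.
    fuzzer_type = 3                      # -Z 3 in the invocation above
    port = 4400 + fuzzer_type            # printf %02d 3 -> port=4403 in the trace

    trid = ("trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 "
            f"traddr:127.0.0.1 trsvcid:{port}")   # passed to llvm_nvme_fuzz via -F

    # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' fuzz_json.conf
    with open("fuzz_json.conf") as src:
        cfg = src.read().replace('"trsvcid": "4420"', f'"trsvcid": "{port}"')
    with open(f"/tmp/fuzz_json_{fuzzer_type}.conf", "w") as dst:
        dst.write(cfg)

The 4400 base is inferred from the two runs visible here (ports 4403 and 4404 for fuzzers 3 and 4), not read out of the script itself.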
00:07:15.644 [2024-04-25 23:54:05.243761] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid472777 ] 00:07:15.903 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.903 [2024-04-25 23:54:05.489984] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.162 [2024-04-25 23:54:05.517787] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:16.162 [2024-04-25 23:54:05.517927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.162 [2024-04-25 23:54:05.569481] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:16.162 [2024-04-25 23:54:05.585768] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:16.162 INFO: Running with entropic power schedule (0xFF, 100). 00:07:16.162 INFO: Seed: 4091441155 00:07:16.162 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:16.162 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:16.162 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:16.162 INFO: A corpus is not provided, starting from an empty corpus 00:07:16.162 #2 INITED exec/s: 0 rss: 60Mb 00:07:16.162 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:16.162 This may also happen if the target rejected all inputs we tried so far 00:07:16.421 NEW_FUNC[1/652]: 0x4a25d0 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:16.421 NEW_FUNC[2/652]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:16.421 #4 NEW cov: 11412 ft: 11411 corp: 2/21b lim: 20 exec/s: 0 rss: 67Mb L: 20/20 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:16.421 #5 NEW cov: 11525 ft: 11873 corp: 3/41b lim: 20 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 CopyPart- 00:07:16.421 #6 NEW cov: 11531 ft: 12186 corp: 4/61b lim: 20 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 ChangeByte- 00:07:16.680 #9 NEW cov: 11616 ft: 12900 corp: 5/65b lim: 20 exec/s: 0 rss: 67Mb L: 4/20 MS: 3 CopyPart-InsertByte-InsertByte- 00:07:16.680 #10 NEW cov: 11624 ft: 13106 corp: 6/80b lim: 20 exec/s: 0 rss: 67Mb L: 15/20 MS: 1 CrossOver- 00:07:16.680 #11 NEW cov: 11624 ft: 13239 corp: 7/84b lim: 20 exec/s: 0 rss: 68Mb L: 4/20 MS: 1 ShuffleBytes- 00:07:16.680 #12 NEW cov: 11624 ft: 13291 corp: 8/104b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ChangeBit- 00:07:16.680 #13 NEW cov: 11624 ft: 13334 corp: 9/124b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:16.680 #14 NEW cov: 11624 ft: 13523 corp: 10/144b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:16.939 #15 NEW cov: 11624 ft: 13542 corp: 11/164b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ChangeBit- 00:07:16.939 #16 NEW cov: 11624 ft: 13619 corp: 12/184b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:16.939 #17 NEW cov: 11624 ft: 13638 corp: 13/204b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CopyPart- 00:07:16.939 #18 NEW cov: 11624 ft: 13665 corp: 14/224b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ChangeByte- 00:07:16.939 #19 NEW cov: 11624 ft: 13692 corp: 15/244b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CMP- DE: 
"\001w\025P0}\225\026"- 00:07:16.939 #20 NEW cov: 11624 ft: 13710 corp: 16/264b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CMP- DE: "\3275{\216U\025w\000"- 00:07:16.939 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:16.939 #26 NEW cov: 11648 ft: 13750 corp: 17/282b lim: 20 exec/s: 0 rss: 69Mb L: 18/20 MS: 1 EraseBytes- 00:07:17.198 #27 NEW cov: 11648 ft: 13840 corp: 18/294b lim: 20 exec/s: 0 rss: 69Mb L: 12/20 MS: 1 EraseBytes- 00:07:17.198 #28 NEW cov: 11648 ft: 13908 corp: 19/310b lim: 20 exec/s: 0 rss: 69Mb L: 16/20 MS: 1 CopyPart- 00:07:17.198 #29 NEW cov: 11648 ft: 13931 corp: 20/330b lim: 20 exec/s: 29 rss: 69Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:17.198 #30 NEW cov: 11648 ft: 13958 corp: 21/350b lim: 20 exec/s: 30 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:17.198 #31 NEW cov: 11648 ft: 13969 corp: 22/363b lim: 20 exec/s: 31 rss: 69Mb L: 13/20 MS: 1 InsertRepeatedBytes- 00:07:17.198 #32 NEW cov: 11648 ft: 14072 corp: 23/383b lim: 20 exec/s: 32 rss: 69Mb L: 20/20 MS: 1 PersAutoDict- DE: "\001w\025P0}\225\026"- 00:07:17.198 #33 NEW cov: 11648 ft: 14097 corp: 24/403b lim: 20 exec/s: 33 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:07:17.457 #34 NEW cov: 11648 ft: 14116 corp: 25/419b lim: 20 exec/s: 34 rss: 69Mb L: 16/20 MS: 1 EraseBytes- 00:07:17.457 #35 NEW cov: 11648 ft: 14133 corp: 26/439b lim: 20 exec/s: 35 rss: 69Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:17.457 #36 NEW cov: 11648 ft: 14153 corp: 27/459b lim: 20 exec/s: 36 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:17.457 #37 NEW cov: 11648 ft: 14174 corp: 28/479b lim: 20 exec/s: 37 rss: 69Mb L: 20/20 MS: 1 ChangeBit- 00:07:17.457 #38 NEW cov: 11648 ft: 14197 corp: 29/499b lim: 20 exec/s: 38 rss: 69Mb L: 20/20 MS: 1 ChangeBit- 00:07:17.457 [2024-04-25 23:54:07.008278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:17.457 [2024-04-25 23:54:07.008319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.457 NEW_FUNC[1/17]: 0x1157d10 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:17.457 NEW_FUNC[2/17]: 0x1158890 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:17.457 #39 NEW cov: 11890 ft: 14531 corp: 30/519b lim: 20 exec/s: 39 rss: 70Mb L: 20/20 MS: 1 CMP- DE: "\000\000\000\000\000\000\003\307"- 00:07:17.716 #40 NEW cov: 11890 ft: 14554 corp: 31/535b lim: 20 exec/s: 40 rss: 70Mb L: 16/20 MS: 1 ChangeBit- 00:07:17.716 #44 NEW cov: 11890 ft: 14557 corp: 32/552b lim: 20 exec/s: 44 rss: 70Mb L: 17/20 MS: 4 ChangeBit-CopyPart-ChangeByte-InsertRepeatedBytes- 00:07:17.716 #45 NEW cov: 11890 ft: 14599 corp: 33/572b lim: 20 exec/s: 45 rss: 70Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:17.716 #46 NEW cov: 11890 ft: 14614 corp: 34/592b lim: 20 exec/s: 46 rss: 70Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:17.716 #47 NEW cov: 11890 ft: 14617 corp: 35/608b lim: 20 exec/s: 47 rss: 70Mb L: 16/20 MS: 1 ChangeBinInt- 00:07:17.716 #48 NEW cov: 11890 ft: 14630 corp: 36/628b lim: 20 exec/s: 48 rss: 70Mb L: 20/20 MS: 1 ChangeByte- 00:07:17.716 #50 NEW cov: 11891 ft: 14839 corp: 37/639b lim: 20 exec/s: 50 rss: 70Mb L: 11/20 MS: 2 CrossOver-PersAutoDict- DE: "\3275{\216U\025w\000"- 00:07:17.974 #51 NEW cov: 11891 ft: 14842 corp: 38/659b lim: 20 exec/s: 51 rss: 70Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:17.975 #52 NEW cov: 11891 
ft: 14858 corp: 39/675b lim: 20 exec/s: 52 rss: 70Mb L: 16/20 MS: 1 ChangeBit- 00:07:17.975 #53 NEW cov: 11891 ft: 14863 corp: 40/695b lim: 20 exec/s: 53 rss: 70Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:17.975 #54 NEW cov: 11891 ft: 14897 corp: 41/715b lim: 20 exec/s: 54 rss: 70Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:17.975 #55 NEW cov: 11891 ft: 14912 corp: 42/733b lim: 20 exec/s: 55 rss: 70Mb L: 18/20 MS: 1 InsertByte- 00:07:17.975 #56 NEW cov: 11891 ft: 15022 corp: 43/753b lim: 20 exec/s: 56 rss: 70Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:18.234 #57 NEW cov: 11891 ft: 15030 corp: 44/771b lim: 20 exec/s: 57 rss: 70Mb L: 18/20 MS: 1 EraseBytes- 00:07:18.234 #58 NEW cov: 11891 ft: 15035 corp: 45/787b lim: 20 exec/s: 29 rss: 70Mb L: 16/20 MS: 1 EraseBytes- 00:07:18.234 #58 DONE cov: 11891 ft: 15035 corp: 45/787b lim: 20 exec/s: 29 rss: 70Mb 00:07:18.234 ###### Recommended dictionary. ###### 00:07:18.234 "\001w\025P0}\225\026" # Uses: 1 00:07:18.234 "\3275{\216U\025w\000" # Uses: 1 00:07:18.234 "\000\000\000\000\000\000\003\307" # Uses: 0 00:07:18.234 ###### End of recommended dictionary. ###### 00:07:18.234 Done 58 runs in 2 second(s) 00:07:18.234 23:54:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:18.234 23:54:07 -- ../common.sh@72 -- # (( i++ )) 00:07:18.234 23:54:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:18.234 23:54:07 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:18.234 23:54:07 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:18.234 23:54:07 -- nvmf/run.sh@24 -- # local timen=1 00:07:18.234 23:54:07 -- nvmf/run.sh@25 -- # local core=0x1 00:07:18.234 23:54:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:18.234 23:54:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:18.234 23:54:07 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:18.234 23:54:07 -- nvmf/run.sh@29 -- # port=4404 00:07:18.234 23:54:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:18.234 23:54:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:18.234 23:54:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:18.234 23:54:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:18.234 [2024-04-25 23:54:07.791510] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
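Both runs end with a "Recommended dictionary" block whose entries are printed with C-style octal escapes (e.g. "\001w\025P0}\225\026"). To reuse them as raw bytes — say in a libFuzzer -dict file or a replay script — the escapes first have to be decoded; a small sketch, assuming only that every escape stays in the octal \000-\377 range:

    def dict_entry_to_bytes(entry: str) -> bytes:
        # C-style octal escapes are also valid Python escapes, so
        # unicode_escape + latin-1 round-trips each \ooo to one raw byte
        return entry.encode('latin-1').decode('unicode_escape').encode('latin-1')

    # Entries copied from the dictionaries printed above
    for e in ['\\000\\000\\000\\000', '\\001w\\025P0}\\225\\026']:
        print(dict_entry_to_bytes(e))   # b'\x00\x00\x00\x00', b'\x01w\x15P0}\x95\x16'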
00:07:18.234 [2024-04-25 23:54:07.791605] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid473397 ] 00:07:18.234 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.493 [2024-04-25 23:54:08.042822] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.493 [2024-04-25 23:54:08.072012] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:18.493 [2024-04-25 23:54:08.072136] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.752 [2024-04-25 23:54:08.123470] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:18.752 [2024-04-25 23:54:08.139759] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:18.752 INFO: Running with entropic power schedule (0xFF, 100). 00:07:18.752 INFO: Seed: 2352472387 00:07:18.752 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:18.752 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:18.752 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:18.752 INFO: A corpus is not provided, starting from an empty corpus 00:07:18.752 #2 INITED exec/s: 0 rss: 59Mb 00:07:18.752 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:18.752 This may also happen if the target rejected all inputs we tried so far 00:07:18.752 [2024-04-25 23:54:08.195102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.752 [2024-04-25 23:54:08.195131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.011 NEW_FUNC[1/664]: 0x4a36c0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:19.011 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:19.011 #4 NEW cov: 11510 ft: 11511 corp: 2/12b lim: 35 exec/s: 0 rss: 66Mb L: 11/11 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:19.011 [2024-04-25 23:54:08.506284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.011 [2024-04-25 23:54:08.506317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.011 [2024-04-25 23:54:08.506374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.011 [2024-04-25 23:54:08.506389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.011 [2024-04-25 23:54:08.506457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.011 [2024-04-25 23:54:08.506472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.011 [2024-04-25 23:54:08.506528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.011 [2024-04-25 23:54:08.506541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.011 #6 NEW cov: 11623 ft: 12793 corp: 3/44b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:19.011 [2024-04-25 23:54:08.546366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4747f747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.011 [2024-04-25 23:54:08.546399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.011 [2024-04-25 23:54:08.546459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.011 [2024-04-25 23:54:08.546473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.011 [2024-04-25 23:54:08.546531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.011 [2024-04-25 23:54:08.546545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.012 [2024-04-25 23:54:08.546600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.012 [2024-04-25 23:54:08.546617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.012 #11 NEW cov: 11629 ft: 12967 corp: 4/76b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 5 ChangeBinInt-InsertByte-EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:19.012 [2024-04-25 23:54:08.585945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0aff0a29 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.012 [2024-04-25 23:54:08.585971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.012 #12 NEW cov: 11714 ft: 13373 corp: 5/88b lim: 35 exec/s: 0 rss: 67Mb L: 12/32 MS: 1 InsertByte- 00:07:19.271 [2024-04-25 23:54:08.626566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.626591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.626651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.626666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.626723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 
cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.626748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.626805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.626818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.271 #13 NEW cov: 11714 ft: 13512 corp: 6/120b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 ChangeByte- 00:07:19.271 [2024-04-25 23:54:08.666160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.666186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.271 #14 NEW cov: 11714 ft: 13655 corp: 7/132b lim: 35 exec/s: 0 rss: 67Mb L: 12/32 MS: 1 InsertByte- 00:07:19.271 [2024-04-25 23:54:08.706779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.706805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.706862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.706875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.706931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.706945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.707014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.707028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.271 #15 NEW cov: 11714 ft: 13740 corp: 8/164b lim: 35 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 CopyPart- 00:07:19.271 [2024-04-25 23:54:08.746409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.746435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.271 #16 NEW cov: 11714 ft: 13840 corp: 9/176b lim: 35 exec/s: 0 rss: 67Mb L: 12/32 MS: 1 ShuffleBytes- 00:07:19.271 [2024-04-25 23:54:08.786581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0aff0a29 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.786607] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.271 #17 NEW cov: 11714 ft: 13925 corp: 10/188b lim: 35 exec/s: 0 rss: 68Mb L: 12/32 MS: 1 ShuffleBytes- 00:07:19.271 [2024-04-25 23:54:08.827135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4747f747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.827161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.827220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.827234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.827292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.827306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.827364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.827378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.271 #18 NEW cov: 11714 ft: 14004 corp: 11/220b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 CrossOver- 00:07:19.271 [2024-04-25 23:54:08.867226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.867251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.867308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.867323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.867382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.867401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.271 [2024-04-25 23:54:08.867457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.271 [2024-04-25 23:54:08.867472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.531 #19 NEW cov: 11714 ft: 14050 corp: 12/252b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 ShuffleBytes- 00:07:19.531 [2024-04-25 23:54:08.907378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.907410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:08.907469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.907482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:08.907536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.907549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:08.907605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48485048 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.907618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.531 #20 NEW cov: 11714 ft: 14104 corp: 13/284b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 ChangeBinInt- 00:07:19.531 [2024-04-25 23:54:08.947502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4747f747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.947528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:08.947580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.947594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:08.947650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.947664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:08.947721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:67474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.947734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.531 #21 NEW cov: 11714 ft: 14171 corp: 14/316b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 ChangeBit- 00:07:19.531 [2024-04-25 23:54:08.987613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4747f747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.987638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:08.987695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.987709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:08.987754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:7a470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.987768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:08.987825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:67474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:08.987842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.531 #22 NEW cov: 11714 ft: 14200 corp: 15/348b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 ChangeByte- 00:07:19.531 [2024-04-25 23:54:09.027719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4747f747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:09.027746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:09.027803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:09.027817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:09.027873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:09.027887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:09.027946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cb564747 cdw11:7fb40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:09.027960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.531 #23 NEW cov: 11714 ft: 14239 corp: 16/380b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 CMP- DE: "\313V\177\264Q\025w\000"- 00:07:19.531 [2024-04-25 23:54:09.067806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:09.067832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:09.067890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:09.067903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:09.067961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 
cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:09.067975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.531 [2024-04-25 23:54:09.068018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:09.068033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.531 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:19.531 #24 NEW cov: 11737 ft: 14273 corp: 17/412b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 ChangeBinInt- 00:07:19.531 [2024-04-25 23:54:09.117456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.531 [2024-04-25 23:54:09.117482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.791 #25 NEW cov: 11737 ft: 14309 corp: 18/425b lim: 35 exec/s: 0 rss: 69Mb L: 13/32 MS: 1 CopyPart- 00:07:19.791 [2024-04-25 23:54:09.158248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.158274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.158334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.158349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.158408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.158423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.158478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48485048 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.158492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.158548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.158562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.791 #26 NEW cov: 11737 ft: 14386 corp: 19/460b lim: 35 exec/s: 26 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:07:19.791 [2024-04-25 23:54:09.198193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.198219] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.198275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.198288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.198344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484821 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.198357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.198413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484850 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.198427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.791 #27 NEW cov: 11737 ft: 14400 corp: 20/493b lim: 35 exec/s: 27 rss: 69Mb L: 33/35 MS: 1 InsertByte- 00:07:19.791 [2024-04-25 23:54:09.238313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.238339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.238404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.238419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.238474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.238488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.238544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.238561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.791 #28 NEW cov: 11737 ft: 14416 corp: 21/525b lim: 35 exec/s: 28 rss: 69Mb L: 32/35 MS: 1 ChangeBit- 00:07:19.791 [2024-04-25 23:54:09.278477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4747f747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.278503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.278559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00004747 cdw11:00200002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.278573] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.278626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.278639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.278692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.278706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.791 #29 NEW cov: 11737 ft: 14423 corp: 22/557b lim: 35 exec/s: 29 rss: 69Mb L: 32/35 MS: 1 ChangeBinInt- 00:07:19.791 [2024-04-25 23:54:09.318042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:7fb4cb56 cdw11:51150002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.318068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.791 #34 NEW cov: 11737 ft: 14498 corp: 23/566b lim: 35 exec/s: 34 rss: 69Mb L: 9/35 MS: 5 ShuffleBytes-CrossOver-CopyPart-ShuffleBytes-PersAutoDict- DE: "\313V\177\264Q\025w\000"- 00:07:19.791 [2024-04-25 23:54:09.358663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.358689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.358745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.358759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.358817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.358832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.791 [2024-04-25 23:54:09.358884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.791 [2024-04-25 23:54:09.358897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.791 #35 NEW cov: 11737 ft: 14515 corp: 24/600b lim: 35 exec/s: 35 rss: 69Mb L: 34/35 MS: 1 CrossOver- 00:07:19.792 [2024-04-25 23:54:09.398340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:27b4cb56 cdw11:51150002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.792 [2024-04-25 23:54:09.398365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.053 #36 NEW cov: 11737 ft: 14535 corp: 25/609b lim: 35 exec/s: 
36 rss: 69Mb L: 9/35 MS: 1 ChangeByte- 00:07:20.053 [2024-04-25 23:54:09.438948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.438973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.053 [2024-04-25 23:54:09.439032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.439046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.053 [2024-04-25 23:54:09.439101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.439116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.053 [2024-04-25 23:54:09.439170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48ff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.439184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.053 #37 NEW cov: 11737 ft: 14560 corp: 26/642b lim: 35 exec/s: 37 rss: 69Mb L: 33/35 MS: 1 InsertByte- 00:07:20.053 [2024-04-25 23:54:09.479077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b8b805b8 cdw11:b8b80001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.479102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.053 [2024-04-25 23:54:09.479158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:4747b847 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.479172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.053 [2024-04-25 23:54:09.479227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.479242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.053 [2024-04-25 23:54:09.479295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cb564747 cdw11:7fb40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.479308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.053 #38 NEW cov: 11737 ft: 14584 corp: 27/674b lim: 35 exec/s: 38 rss: 70Mb L: 32/35 MS: 1 ChangeBinInt- 00:07:20.053 [2024-04-25 23:54:09.519164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.519189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.053 [2024-04-25 23:54:09.519246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.519260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.053 [2024-04-25 23:54:09.519316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.519330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.053 [2024-04-25 23:54:09.519386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:12484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.519404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.053 #39 NEW cov: 11737 ft: 14589 corp: 28/707b lim: 35 exec/s: 39 rss: 70Mb L: 33/35 MS: 1 InsertByte- 00:07:20.053 [2024-04-25 23:54:09.558973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0aff0a29 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.558999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.053 [2024-04-25 23:54:09.559058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cb56ffff cdw11:7fb40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.053 [2024-04-25 23:54:09.559072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.053 #40 NEW cov: 11737 ft: 14826 corp: 29/727b lim: 35 exec/s: 40 rss: 70Mb L: 20/35 MS: 1 PersAutoDict- DE: "\313V\177\264Q\025w\000"- 00:07:20.053 [2024-04-25 23:54:09.599407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4747f747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.054 [2024-04-25 23:54:09.599432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.054 [2024-04-25 23:54:09.599491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.054 [2024-04-25 23:54:09.599505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.054 [2024-04-25 23:54:09.599559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:7a470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.054 [2024-04-25 23:54:09.599574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.054 [2024-04-25 23:54:09.599631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0c474700 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.054 [2024-04-25 23:54:09.599644] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.054 #41 NEW cov: 11737 ft: 14831 corp: 30/759b lim: 35 exec/s: 41 rss: 70Mb L: 32/35 MS: 1 CMP- DE: "\000\014"- 00:07:20.054 [2024-04-25 23:54:09.639513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.054 [2024-04-25 23:54:09.639538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.054 [2024-04-25 23:54:09.639594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48bd4848 cdw11:b7b70001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.054 [2024-04-25 23:54:09.639608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.054 [2024-04-25 23:54:09.639662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.054 [2024-04-25 23:54:09.639676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.054 [2024-04-25 23:54:09.639729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.054 [2024-04-25 23:54:09.639746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.313 #42 NEW cov: 11737 ft: 14839 corp: 31/793b lim: 35 exec/s: 42 rss: 70Mb L: 34/35 MS: 1 ChangeBinInt- 00:07:20.313 [2024-04-25 23:54:09.679639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.679663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.679719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.679732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.679786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.679800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.679852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48485048 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.679865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.313 #43 NEW cov: 11737 ft: 14845 corp: 32/825b lim: 35 exec/s: 43 rss: 70Mb L: 32/35 MS: 1 ChangeByte- 00:07:20.313 [2024-04-25 23:54:09.719658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0003 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.719684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.719741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0affff0a cdw11:32ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.719755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.719812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.719826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.313 #44 NEW cov: 11737 ft: 15053 corp: 33/849b lim: 35 exec/s: 44 rss: 70Mb L: 24/35 MS: 1 CrossOver- 00:07:20.313 [2024-04-25 23:54:09.759901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4747f747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.759926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.759982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:47050000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.759995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.760048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:7a470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.760062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.760115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:67474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.760131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.313 #45 NEW cov: 11737 ft: 15075 corp: 34/881b lim: 35 exec/s: 45 rss: 70Mb L: 32/35 MS: 1 CMP- DE: "\005\000"- 00:07:20.313 [2024-04-25 23:54:09.800006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.800031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.800087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.800101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.800154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 
cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.800168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.800221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48485048 cdw11:28480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.800233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.313 #46 NEW cov: 11737 ft: 15090 corp: 35/913b lim: 35 exec/s: 46 rss: 70Mb L: 32/35 MS: 1 ChangeByte- 00:07:20.313 [2024-04-25 23:54:09.840115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.840140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.840176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.840190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.840242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.840256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.840310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.840323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.313 #47 NEW cov: 11737 ft: 15138 corp: 36/945b lim: 35 exec/s: 47 rss: 70Mb L: 32/35 MS: 1 ShuffleBytes- 00:07:20.313 [2024-04-25 23:54:09.879782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:27b4cb56 cdw11:51050000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.879807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.313 #48 NEW cov: 11737 ft: 15140 corp: 37/954b lim: 35 exec/s: 48 rss: 70Mb L: 9/35 MS: 1 PersAutoDict- DE: "\005\000"- 00:07:20.313 [2024-04-25 23:54:09.920429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.920455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.920504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.920521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.313 
[2024-04-25 23:54:09.920576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48324848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.920590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.313 [2024-04-25 23:54:09.920643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:12484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.313 [2024-04-25 23:54:09.920656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.572 #49 NEW cov: 11737 ft: 15152 corp: 38/987b lim: 35 exec/s: 49 rss: 70Mb L: 33/35 MS: 1 ChangeByte- 00:07:20.572 [2024-04-25 23:54:09.960499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4747f747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:09.960524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:09.960579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:4747474f cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:09.960593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:09.960647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:47474747 cdw11:7a470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:09.960661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:09.960718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0c474700 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:09.960732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.572 #50 NEW cov: 11737 ft: 15171 corp: 39/1019b lim: 35 exec/s: 50 rss: 70Mb L: 32/35 MS: 1 ChangeBit- 00:07:20.572 [2024-04-25 23:54:10.000456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.000481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.000538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0affff0a cdw11:30ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.000552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.000608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.000624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.572 
#51 NEW cov: 11737 ft: 15179 corp: 40/1043b lim: 35 exec/s: 51 rss: 70Mb L: 24/35 MS: 1 ChangeASCIIInt- 00:07:20.572 [2024-04-25 23:54:10.040683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:b7480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.040709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.040765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.040797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.040852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.040866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.040921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.040935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.572 #52 NEW cov: 11737 ft: 15181 corp: 41/1075b lim: 35 exec/s: 52 rss: 70Mb L: 32/35 MS: 1 ChangeByte- 00:07:20.572 [2024-04-25 23:54:10.080859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b8b805b8 cdw11:b8b80001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.080885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.080941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:4747b847 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.080955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.081011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:07474747 cdw11:47470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.081025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.081080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:cb564747 cdw11:7fb40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.081093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.572 #53 NEW cov: 11737 ft: 15186 corp: 42/1107b lim: 35 exec/s: 53 rss: 70Mb L: 32/35 MS: 1 ChangeBit- 00:07:20.572 [2024-04-25 23:54:10.120756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.120783] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.120841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.120855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.120911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.120925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.572 #54 NEW cov: 11737 ft: 15200 corp: 43/1133b lim: 35 exec/s: 54 rss: 70Mb L: 26/35 MS: 1 EraseBytes- 00:07:20.572 [2024-04-25 23:54:10.161095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:4848ff0a cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.161121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.161177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.161205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.161255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:48324848 cdw11:48480000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.161269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.572 [2024-04-25 23:54:10.161323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:12484848 cdw11:48480002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.572 [2024-04-25 23:54:10.161336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.831 #55 NEW cov: 11737 ft: 15215 corp: 44/1166b lim: 35 exec/s: 27 rss: 70Mb L: 33/35 MS: 1 ChangeByte- 00:07:20.831 #55 DONE cov: 11737 ft: 15215 corp: 44/1166b lim: 35 exec/s: 27 rss: 70Mb 00:07:20.831 ###### Recommended dictionary. ###### 00:07:20.831 "\313V\177\264Q\025w\000" # Uses: 2 00:07:20.831 "\000\014" # Uses: 0 00:07:20.831 "\005\000" # Uses: 1 00:07:20.831 ###### End of recommended dictionary. 
###### 00:07:20.831 Done 55 runs in 2 second(s) 00:07:20.831 23:54:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:20.831 23:54:10 -- ../common.sh@72 -- # (( i++ )) 00:07:20.831 23:54:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:20.831 23:54:10 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:20.831 23:54:10 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:20.831 23:54:10 -- nvmf/run.sh@24 -- # local timen=1 00:07:20.831 23:54:10 -- nvmf/run.sh@25 -- # local core=0x1 00:07:20.831 23:54:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:20.831 23:54:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:20.831 23:54:10 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:20.831 23:54:10 -- nvmf/run.sh@29 -- # port=4405 00:07:20.831 23:54:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:20.831 23:54:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:20.831 23:54:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:20.831 23:54:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:20.831 [2024-04-25 23:54:10.344461] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:20.831 [2024-04-25 23:54:10.344529] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid473739 ] 00:07:20.831 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.090 [2024-04-25 23:54:10.598081] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.090 [2024-04-25 23:54:10.624483] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:21.090 [2024-04-25 23:54:10.624607] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.090 [2024-04-25 23:54:10.675969] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:21.090 [2024-04-25 23:54:10.692264] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:21.349 INFO: Running with entropic power schedule (0xFF, 100). 00:07:21.349 INFO: Seed: 608502036 00:07:21.349 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:21.349 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:21.349 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:21.349 INFO: A corpus is not provided, starting from an empty corpus 00:07:21.349 #2 INITED exec/s: 0 rss: 59Mb 00:07:21.349 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:21.349 This may also happen if the target rejected all inputs we tried so far 00:07:21.349 [2024-04-25 23:54:10.737971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.349 [2024-04-25 23:54:10.738001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.349 [2024-04-25 23:54:10.738055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.349 [2024-04-25 23:54:10.738068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.349 [2024-04-25 23:54:10.738119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.349 [2024-04-25 23:54:10.738133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.349 [2024-04-25 23:54:10.738186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.349 [2024-04-25 23:54:10.738198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.609 NEW_FUNC[1/664]: 0x4a5850 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:21.609 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:21.609 #3 NEW cov: 11521 ft: 11522 corp: 2/38b lim: 45 exec/s: 0 rss: 67Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:07:21.609 [2024-04-25 23:54:11.038817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.038849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.609 [2024-04-25 23:54:11.038906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.038920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.609 [2024-04-25 23:54:11.038973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.038988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.609 [2024-04-25 23:54:11.039039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.039052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.609 #4 NEW cov: 11634 ft: 11968 corp: 
3/76b lim: 45 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 InsertByte- 00:07:21.609 [2024-04-25 23:54:11.088801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.088830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.609 [2024-04-25 23:54:11.088888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.088906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.609 [2024-04-25 23:54:11.088961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff0affff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.088975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.609 [2024-04-25 23:54:11.089029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.089042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.609 #5 NEW cov: 11640 ft: 12198 corp: 4/115b lim: 45 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 CrossOver- 00:07:21.609 [2024-04-25 23:54:11.128792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.128819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.609 [2024-04-25 23:54:11.128877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.128891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.609 [2024-04-25 23:54:11.128946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.128961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.609 #6 NEW cov: 11725 ft: 12806 corp: 5/144b lim: 45 exec/s: 0 rss: 67Mb L: 29/39 MS: 1 EraseBytes- 00:07:21.609 [2024-04-25 23:54:11.168713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.168739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.609 [2024-04-25 23:54:11.168792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.168806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.609 #8 NEW cov: 11725 ft: 13146 corp: 6/163b lim: 45 exec/s: 0 rss: 67Mb L: 19/39 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:21.609 [2024-04-25 23:54:11.208678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.609 [2024-04-25 23:54:11.208703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.868 #9 NEW cov: 11725 ft: 13985 corp: 7/176b lim: 45 exec/s: 0 rss: 67Mb L: 13/39 MS: 1 EraseBytes- 00:07:21.868 [2024-04-25 23:54:11.248817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.868 [2024-04-25 23:54:11.248843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.868 #10 NEW cov: 11725 ft: 14157 corp: 8/189b lim: 45 exec/s: 0 rss: 67Mb L: 13/39 MS: 1 ChangeByte- 00:07:21.868 [2024-04-25 23:54:11.289252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.868 [2024-04-25 23:54:11.289277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.868 [2024-04-25 23:54:11.289336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.868 [2024-04-25 23:54:11.289351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.869 [2024-04-25 23:54:11.289406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff2dffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.289420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.869 #11 NEW cov: 11725 ft: 14188 corp: 9/219b lim: 45 exec/s: 0 rss: 68Mb L: 30/39 MS: 1 InsertByte- 00:07:21.869 [2024-04-25 23:54:11.329403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.329428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.869 [2024-04-25 23:54:11.329483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.329497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.869 [2024-04-25 23:54:11.329553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.329567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.869 #12 NEW cov: 11725 ft: 14218 corp: 10/248b lim: 45 exec/s: 0 
rss: 69Mb L: 29/39 MS: 1 EraseBytes- 00:07:21.869 [2024-04-25 23:54:11.369499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d0d0ffd0 cdw11:d0d00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.369524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.869 [2024-04-25 23:54:11.369580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.369594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.869 [2024-04-25 23:54:11.369648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d0d0d0d0 cdw11:d0d00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.369662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.869 #16 NEW cov: 11725 ft: 14258 corp: 11/279b lim: 45 exec/s: 0 rss: 69Mb L: 31/39 MS: 4 ChangeByte-CrossOver-CopyPart-InsertRepeatedBytes- 00:07:21.869 [2024-04-25 23:54:11.409209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.409234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.869 #17 NEW cov: 11725 ft: 14385 corp: 12/292b lim: 45 exec/s: 0 rss: 69Mb L: 13/39 MS: 1 ChangeBit- 00:07:21.869 [2024-04-25 23:54:11.449690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.449715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.869 [2024-04-25 23:54:11.449775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.449793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.869 [2024-04-25 23:54:11.449849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff2dffff cdw11:ff300007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.869 [2024-04-25 23:54:11.449863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.869 #18 NEW cov: 11725 ft: 14489 corp: 13/322b lim: 45 exec/s: 0 rss: 69Mb L: 30/39 MS: 1 ChangeByte- 00:07:22.128 [2024-04-25 23:54:11.489614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.489639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.128 [2024-04-25 23:54:11.489695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.489709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.128 #19 NEW cov: 11725 ft: 14517 corp: 14/340b lim: 45 exec/s: 0 rss: 69Mb L: 18/39 MS: 1 EraseBytes- 00:07:22.128 [2024-04-25 23:54:11.529754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.529779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.128 [2024-04-25 23:54:11.529836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.529850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.128 #20 NEW cov: 11725 ft: 14622 corp: 15/358b lim: 45 exec/s: 0 rss: 69Mb L: 18/39 MS: 1 CopyPart- 00:07:22.128 [2024-04-25 23:54:11.569882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.569907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.128 [2024-04-25 23:54:11.569963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffdfffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.569977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.128 #21 NEW cov: 11725 ft: 14655 corp: 16/376b lim: 45 exec/s: 0 rss: 69Mb L: 18/39 MS: 1 ChangeBit- 00:07:22.128 [2024-04-25 23:54:11.610142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.610166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.128 [2024-04-25 23:54:11.610222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.610237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.128 [2024-04-25 23:54:11.610293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.610306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.128 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:22.128 #22 NEW cov: 11748 ft: 14672 corp: 17/405b lim: 45 exec/s: 0 rss: 69Mb L: 29/39 MS: 1 ChangeByte- 00:07:22.128 [2024-04-25 23:54:11.650110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff120000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:22.128 [2024-04-25 23:54:11.650136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.128 [2024-04-25 23:54:11.650191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.650205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.128 #23 NEW cov: 11748 ft: 14684 corp: 18/423b lim: 45 exec/s: 0 rss: 69Mb L: 18/39 MS: 1 ChangeBinInt- 00:07:22.128 [2024-04-25 23:54:11.690223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff120000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.690248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.128 [2024-04-25 23:54:11.690303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.128 [2024-04-25 23:54:11.690318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.128 #24 NEW cov: 11748 ft: 14781 corp: 19/441b lim: 45 exec/s: 0 rss: 70Mb L: 18/39 MS: 1 ShuffleBytes- 00:07:22.129 [2024-04-25 23:54:11.730660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.129 [2024-04-25 23:54:11.730686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.129 [2024-04-25 23:54:11.730741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.129 [2024-04-25 23:54:11.730755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.129 [2024-04-25 23:54:11.730808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff0affff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.129 [2024-04-25 23:54:11.730822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.129 [2024-04-25 23:54:11.730869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.129 [2024-04-25 23:54:11.730882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.388 #25 NEW cov: 11748 ft: 14818 corp: 20/480b lim: 45 exec/s: 25 rss: 70Mb L: 39/39 MS: 1 ChangeBit- 00:07:22.388 [2024-04-25 23:54:11.770650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:84ffe3ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.770676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.388 [2024-04-25 23:54:11.770734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.770749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.388 [2024-04-25 23:54:11.770802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.770817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.388 #26 NEW cov: 11748 ft: 14841 corp: 21/510b lim: 45 exec/s: 26 rss: 70Mb L: 30/39 MS: 1 InsertByte- 00:07:22.388 [2024-04-25 23:54:11.810576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.810602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.388 [2024-04-25 23:54:11.810659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0077ff3d cdw11:15530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.810674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.388 #27 NEW cov: 11748 ft: 14844 corp: 22/531b lim: 45 exec/s: 27 rss: 70Mb L: 21/39 MS: 1 CMP- DE: "\000w\025SQ\033d\232"- 00:07:22.388 [2024-04-25 23:54:11.850969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.850993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.388 [2024-04-25 23:54:11.851049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.851063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.388 [2024-04-25 23:54:11.851118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffd2ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.851131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.388 #28 NEW cov: 11748 ft: 14864 corp: 23/561b lim: 45 exec/s: 28 rss: 70Mb L: 30/39 MS: 1 InsertByte- 00:07:22.388 [2024-04-25 23:54:11.890601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.890627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.388 #29 NEW cov: 11748 ft: 14878 corp: 24/575b lim: 45 exec/s: 29 rss: 70Mb L: 14/39 MS: 1 InsertByte- 00:07:22.388 [2024-04-25 23:54:11.931223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:22.388 [2024-04-25 23:54:11.931248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.388 [2024-04-25 23:54:11.931303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.931317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.388 [2024-04-25 23:54:11.931372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.931386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.388 [2024-04-25 23:54:11.931442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.931456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.388 #30 NEW cov: 11748 ft: 14886 corp: 25/618b lim: 45 exec/s: 30 rss: 70Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:07:22.388 [2024-04-25 23:54:11.971033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.971063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.388 [2024-04-25 23:54:11.971118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.388 [2024-04-25 23:54:11.971133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.388 #31 NEW cov: 11748 ft: 14924 corp: 26/636b lim: 45 exec/s: 31 rss: 70Mb L: 18/43 MS: 1 ChangeBit- 00:07:22.648 [2024-04-25 23:54:12.011469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.011494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.011548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.011563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.011615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.011630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.011682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:22.648 [2024-04-25 23:54:12.011695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.648 #32 NEW cov: 11748 ft: 14933 corp: 27/678b lim: 45 exec/s: 32 rss: 70Mb L: 42/43 MS: 1 InsertRepeatedBytes- 00:07:22.648 [2024-04-25 23:54:12.051455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff84 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.051481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.051536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff840007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.051550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.051606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.051620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.648 #33 NEW cov: 11748 ft: 14963 corp: 28/705b lim: 45 exec/s: 33 rss: 70Mb L: 27/43 MS: 1 CrossOver- 00:07:22.648 [2024-04-25 23:54:12.091350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff080000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.091376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.091437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.091452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.648 #34 NEW cov: 11748 ft: 15038 corp: 29/724b lim: 45 exec/s: 34 rss: 70Mb L: 19/43 MS: 1 ChangeBinInt- 00:07:22.648 [2024-04-25 23:54:12.131449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.131474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.131530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.131545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.648 #35 NEW cov: 11748 ft: 15083 corp: 30/742b lim: 45 exec/s: 35 rss: 70Mb L: 18/43 MS: 1 ChangeByte- 00:07:22.648 [2024-04-25 23:54:12.171619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff280000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.171645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.171702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.171715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.648 #36 NEW cov: 11748 ft: 15092 corp: 31/760b lim: 45 exec/s: 36 rss: 70Mb L: 18/43 MS: 1 CopyPart- 00:07:22.648 [2024-04-25 23:54:12.212060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:84ffe3ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.212085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.212141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.212155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.212209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.212223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.212280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.212293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.648 #37 NEW cov: 11748 ft: 15108 corp: 32/799b lim: 45 exec/s: 37 rss: 70Mb L: 39/43 MS: 1 InsertRepeatedBytes- 00:07:22.648 [2024-04-25 23:54:12.252002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.252027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.648 [2024-04-25 23:54:12.252083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.648 [2024-04-25 23:54:12.252097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.649 [2024-04-25 23:54:12.252152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.649 [2024-04-25 23:54:12.252166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.908 #38 NEW cov: 11748 ft: 15127 corp: 33/830b lim: 45 exec/s: 38 rss: 70Mb L: 31/43 MS: 1 InsertRepeatedBytes- 00:07:22.908 [2024-04-25 23:54:12.292225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1d1dffff cdw11:1d1d0000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:22.908 [2024-04-25 23:54:12.292250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.908 [2024-04-25 23:54:12.292307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1d1d1d1d cdw11:1d1d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.292322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.908 [2024-04-25 23:54:12.292379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1d1d1d1d cdw11:1d1d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.292393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.908 [2024-04-25 23:54:12.292455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff1d1d cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.292469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.908 #39 NEW cov: 11748 ft: 15146 corp: 34/871b lim: 45 exec/s: 39 rss: 70Mb L: 41/43 MS: 1 InsertRepeatedBytes- 00:07:22.908 [2024-04-25 23:54:12.332024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.332048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.908 [2024-04-25 23:54:12.332105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.332119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.908 #40 NEW cov: 11748 ft: 15191 corp: 35/894b lim: 45 exec/s: 40 rss: 70Mb L: 23/43 MS: 1 InsertRepeatedBytes- 00:07:22.908 [2024-04-25 23:54:12.372494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff86 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.372519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.908 [2024-04-25 23:54:12.372576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.372591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.908 [2024-04-25 23:54:12.372644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.372659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.908 [2024-04-25 23:54:12.372712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.372726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.908 #41 NEW cov: 11748 ft: 15221 corp: 36/932b lim: 45 exec/s: 41 rss: 70Mb L: 38/43 MS: 1 ChangeBit- 00:07:22.908 [2024-04-25 23:54:12.412234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffdf cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.412262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.908 [2024-04-25 23:54:12.412319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.412333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.908 #42 NEW cov: 11748 ft: 15255 corp: 37/950b lim: 45 exec/s: 42 rss: 70Mb L: 18/43 MS: 1 ChangeBit- 00:07:22.908 [2024-04-25 23:54:12.452200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15530077 cdw11:511b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.452226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.908 #43 NEW cov: 11748 ft: 15275 corp: 38/964b lim: 45 exec/s: 43 rss: 70Mb L: 14/43 MS: 1 PersAutoDict- DE: "\000w\025SQ\033d\232"- 00:07:22.908 [2024-04-25 23:54:12.492298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.908 [2024-04-25 23:54:12.492323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.908 #44 NEW cov: 11748 ft: 15283 corp: 39/980b lim: 45 exec/s: 44 rss: 70Mb L: 16/43 MS: 1 CopyPart- 00:07:23.168 [2024-04-25 23:54:12.532797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d0d0ffd0 cdw11:d0d00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.532823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.168 [2024-04-25 23:54:12.532880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.532894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.168 [2024-04-25 23:54:12.532954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d0d0d0d0 cdw11:d0d00006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.532968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.168 #45 NEW cov: 11748 ft: 15321 corp: 40/1011b lim: 45 exec/s: 45 rss: 70Mb L: 31/43 MS: 1 ShuffleBytes- 00:07:23.168 [2024-04-25 23:54:12.573062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff86 cdw11:ffff0007 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.573087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.168 [2024-04-25 23:54:12.573143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.573157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.168 [2024-04-25 23:54:12.573211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.573226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.168 [2024-04-25 23:54:12.573281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.573294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.168 #46 NEW cov: 11748 ft: 15324 corp: 41/1049b lim: 45 exec/s: 46 rss: 70Mb L: 38/43 MS: 1 ChangeBinInt- 00:07:23.168 [2024-04-25 23:54:12.612650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a05 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.612675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.168 #47 NEW cov: 11748 ft: 15339 corp: 42/1058b lim: 45 exec/s: 47 rss: 70Mb L: 9/43 MS: 1 CMP- DE: "\005\000\000\000\000\000\000\000"- 00:07:23.168 [2024-04-25 23:54:12.652983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff120000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.653008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.168 [2024-04-25 23:54:12.653064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.653078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.168 #48 NEW cov: 11748 ft: 15352 corp: 43/1076b lim: 45 exec/s: 48 rss: 70Mb L: 18/43 MS: 1 ChangeBinInt- 00:07:23.168 [2024-04-25 23:54:12.692928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15530077 cdw11:511b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.692954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.168 #49 NEW cov: 11748 ft: 15363 corp: 44/1090b lim: 45 exec/s: 49 rss: 70Mb L: 14/43 MS: 1 ChangeBit- 00:07:23.168 [2024-04-25 23:54:12.733508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:84ffe3ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.733534] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.168 [2024-04-25 23:54:12.733591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.733606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.168 [2024-04-25 23:54:12.733660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fffd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.733675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.168 [2024-04-25 23:54:12.733729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.168 [2024-04-25 23:54:12.733743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.168 #50 NEW cov: 11748 ft: 15368 corp: 45/1129b lim: 45 exec/s: 25 rss: 70Mb L: 39/43 MS: 1 ChangeBit- 00:07:23.168 #50 DONE cov: 11748 ft: 15368 corp: 45/1129b lim: 45 exec/s: 25 rss: 70Mb 00:07:23.168 ###### Recommended dictionary. ###### 00:07:23.168 "\000w\025SQ\033d\232" # Uses: 1 00:07:23.168 "\005\000\000\000\000\000\000\000" # Uses: 0 00:07:23.168 ###### End of recommended dictionary. ###### 00:07:23.168 Done 50 runs in 2 second(s) 00:07:23.428 23:54:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:23.428 23:54:12 -- ../common.sh@72 -- # (( i++ )) 00:07:23.428 23:54:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:23.428 23:54:12 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:23.428 23:54:12 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:23.428 23:54:12 -- nvmf/run.sh@24 -- # local timen=1 00:07:23.428 23:54:12 -- nvmf/run.sh@25 -- # local core=0x1 00:07:23.428 23:54:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:23.428 23:54:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:23.428 23:54:12 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:23.428 23:54:12 -- nvmf/run.sh@29 -- # port=4406 00:07:23.428 23:54:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:23.428 23:54:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:23.428 23:54:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:23.428 23:54:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:23.428 [2024-04-25 23:54:12.913536] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:23.428 [2024-04-25 23:54:12.913606] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid474235 ] 00:07:23.428 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.687 [2024-04-25 23:54:13.163676] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.687 [2024-04-25 23:54:13.192192] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:23.687 [2024-04-25 23:54:13.192319] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.687 [2024-04-25 23:54:13.243786] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:23.687 [2024-04-25 23:54:13.260076] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:23.687 INFO: Running with entropic power schedule (0xFF, 100). 00:07:23.687 INFO: Seed: 3175502671 00:07:23.687 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:23.687 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:23.687 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:23.687 INFO: A corpus is not provided, starting from an empty corpus 00:07:23.687 #2 INITED exec/s: 0 rss: 59Mb 00:07:23.687 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:23.687 This may also happen if the target rejected all inputs we tried so far 00:07:23.945 [2024-04-25 23:54:13.308748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:23.945 [2024-04-25 23:54:13.308775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.204 NEW_FUNC[1/660]: 0x4a8060 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:24.204 NEW_FUNC[2/660]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:24.204 #3 NEW cov: 11419 ft: 11439 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CrossOver- 00:07:24.204 [2024-04-25 23:54:13.619493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.204 [2024-04-25 23:54:13.619526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.204 #4 NEW cov: 11533 ft: 11791 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:24.204 [2024-04-25 23:54:13.659644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.204 [2024-04-25 23:54:13.659671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.204 [2024-04-25 23:54:13.659722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.204 [2024-04-25 23:54:13.659738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.204 
NEW_FUNC[1/2]: 0x1c793e0 in spdk_thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1151 00:07:24.204 NEW_FUNC[2/2]: 0x1c79bc0 in thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1055 00:07:24.204 #5 NEW cov: 11557 ft: 12254 corp: 4/9b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 CopyPart- 00:07:24.204 [2024-04-25 23:54:13.699626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f841 cdw11:00000000 00:07:24.204 [2024-04-25 23:54:13.699653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.204 #10 NEW cov: 11642 ft: 12625 corp: 5/11b lim: 10 exec/s: 0 rss: 68Mb L: 2/4 MS: 5 ShuffleBytes-ChangeByte-CrossOver-ChangeBinInt-InsertByte- 00:07:24.204 [2024-04-25 23:54:13.739813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.204 [2024-04-25 23:54:13.739839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.204 [2024-04-25 23:54:13.739889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.204 [2024-04-25 23:54:13.739902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.204 #11 NEW cov: 11642 ft: 12715 corp: 6/15b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 ShuffleBytes- 00:07:24.204 [2024-04-25 23:54:13.779995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.204 [2024-04-25 23:54:13.780020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.204 [2024-04-25 23:54:13.780068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.204 [2024-04-25 23:54:13.780081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.204 #12 NEW cov: 11642 ft: 12788 corp: 7/20b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:07:24.462 [2024-04-25 23:54:13.820342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cfcf cdw11:00000000 00:07:24.462 [2024-04-25 23:54:13.820367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.462 [2024-04-25 23:54:13.820426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000cfcf cdw11:00000000 00:07:24.462 [2024-04-25 23:54:13.820441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.462 [2024-04-25 23:54:13.820488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.820501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:13.820549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a 
cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.820562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.463 #13 NEW cov: 11642 ft: 13160 corp: 8/28b lim: 10 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:24.463 [2024-04-25 23:54:13.860446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007575 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.860471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:13.860520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007575 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.860536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:13.860584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007575 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.860597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:13.860644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.860657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.463 #14 NEW cov: 11642 ft: 13169 corp: 9/36b lim: 10 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:24.463 [2024-04-25 23:54:13.900321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.900346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.463 #15 NEW cov: 11642 ft: 13282 corp: 10/38b lim: 10 exec/s: 0 rss: 68Mb L: 2/8 MS: 1 CopyPart- 00:07:24.463 [2024-04-25 23:54:13.930499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.930524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:13.930575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b6b6 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.930589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:13.930637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000b6b6 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.930651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.463 #16 NEW cov: 11642 ft: 13459 corp: 11/44b lim: 10 exec/s: 0 rss: 68Mb L: 6/8 MS: 1 InsertRepeatedBytes- 00:07:24.463 [2024-04-25 23:54:13.970760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007fd5 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.970784] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:13.970836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005d03 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.970850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:13.970899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005a15 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.970913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:13.970961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007700 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:13.970974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.463 #17 NEW cov: 11642 ft: 13490 corp: 12/52b lim: 10 exec/s: 0 rss: 69Mb L: 8/8 MS: 1 CMP- DE: "\177\325]\003Z\025w\000"- 00:07:24.463 [2024-04-25 23:54:14.010512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a1a cdw11:00000000 00:07:24.463 [2024-04-25 23:54:14.010537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.463 #18 NEW cov: 11642 ft: 13527 corp: 13/54b lim: 10 exec/s: 0 rss: 69Mb L: 2/8 MS: 1 ChangeBit- 00:07:24.463 [2024-04-25 23:54:14.051041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007fd5 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:14.051066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:14.051114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005d03 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:14.051127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:14.051176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005a15 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:14.051190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:14.051236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007700 cdw11:00000000 00:07:24.463 [2024-04-25 23:54:14.051248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.463 [2024-04-25 23:54:14.051295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.463 [2024-04-25 23:54:14.051309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:24.463 #19 NEW cov: 11642 ft: 13579 corp: 14/64b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 PersAutoDict- DE: "\177\325]\003Z\025w\000"- 00:07:24.722 [2024-04-25 23:54:14.091076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008000 cdw11:00000000 00:07:24.722 [2024-04-25 23:54:14.091101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.722 [2024-04-25 23:54:14.091151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:24.722 [2024-04-25 23:54:14.091165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.722 [2024-04-25 23:54:14.091215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:24.722 [2024-04-25 23:54:14.091228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.722 [2024-04-25 23:54:14.091276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:24.722 [2024-04-25 23:54:14.091288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.722 #21 NEW cov: 11642 ft: 13604 corp: 15/73b lim: 10 exec/s: 0 rss: 69Mb L: 9/10 MS: 2 ShuffleBytes-CMP- DE: "\200\000\000\000\000\000\000\000"- 00:07:24.722 [2024-04-25 23:54:14.130861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.722 [2024-04-25 23:54:14.130885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.722 #22 NEW cov: 11642 ft: 13637 corp: 16/76b lim: 10 exec/s: 0 rss: 69Mb L: 3/10 MS: 1 EraseBytes- 00:07:24.722 [2024-04-25 23:54:14.170955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a0a cdw11:00000000 00:07:24.722 [2024-04-25 23:54:14.170979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.722 #23 NEW cov: 11642 ft: 13674 corp: 17/78b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:07:24.722 [2024-04-25 23:54:14.201087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.722 [2024-04-25 23:54:14.201112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.722 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:24.722 #24 NEW cov: 11665 ft: 13754 corp: 18/80b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 CrossOver- 00:07:24.722 [2024-04-25 23:54:14.241161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 00:07:24.722 [2024-04-25 23:54:14.241186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.722 #25 NEW cov: 11665 ft: 13764 corp: 19/82b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:24.722 [2024-04-25 23:54:14.281284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f841 cdw11:00000000 00:07:24.722 [2024-04-25 23:54:14.281308] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.722 #26 NEW cov: 11665 ft: 13791 corp: 20/84b lim: 10 exec/s: 26 rss: 69Mb L: 2/10 MS: 1 CopyPart- 00:07:24.722 [2024-04-25 23:54:14.321412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 00:07:24.722 [2024-04-25 23:54:14.321436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.981 #27 NEW cov: 11665 ft: 13843 corp: 21/87b lim: 10 exec/s: 27 rss: 69Mb L: 3/10 MS: 1 CopyPart- 00:07:24.981 [2024-04-25 23:54:14.361478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a0e cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.361502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.981 #28 NEW cov: 11665 ft: 13870 corp: 22/89b lim: 10 exec/s: 28 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:24.981 [2024-04-25 23:54:14.401816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ab6 cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.401840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.981 [2024-04-25 23:54:14.401890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b6b6 cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.401903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.981 [2024-04-25 23:54:14.401954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000ab6 cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.401967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.981 #29 NEW cov: 11665 ft: 13885 corp: 23/95b lim: 10 exec/s: 29 rss: 70Mb L: 6/10 MS: 1 ShuffleBytes- 00:07:24.981 [2024-04-25 23:54:14.441832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.441856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.981 [2024-04-25 23:54:14.441907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008a0a cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.441920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.981 #30 NEW cov: 11665 ft: 13902 corp: 24/100b lim: 10 exec/s: 30 rss: 70Mb L: 5/10 MS: 1 ChangeBit- 00:07:24.981 [2024-04-25 23:54:14.482093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.482118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.981 [2024-04-25 23:54:14.482170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008a0a cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.482187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.981 [2024-04-25 23:54:14.482237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.482251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.981 #31 NEW cov: 11665 ft: 13924 corp: 25/107b lim: 10 exec/s: 31 rss: 70Mb L: 7/10 MS: 1 CrossOver- 00:07:24.981 [2024-04-25 23:54:14.521936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.521961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.981 #32 NEW cov: 11665 ft: 13985 corp: 26/110b lim: 10 exec/s: 32 rss: 70Mb L: 3/10 MS: 1 EraseBytes- 00:07:24.981 [2024-04-25 23:54:14.562097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a27 cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.562121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.981 #33 NEW cov: 11665 ft: 14017 corp: 27/112b lim: 10 exec/s: 33 rss: 70Mb L: 2/10 MS: 1 ChangeByte- 00:07:24.981 [2024-04-25 23:54:14.592155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003202 cdw11:00000000 00:07:24.981 [2024-04-25 23:54:14.592180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.240 #34 NEW cov: 11665 ft: 14057 corp: 28/115b lim: 10 exec/s: 34 rss: 70Mb L: 3/10 MS: 1 ChangeByte- 00:07:25.240 [2024-04-25 23:54:14.632515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.240 [2024-04-25 23:54:14.632539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.240 [2024-04-25 23:54:14.632591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008a25 cdw11:00000000 00:07:25.240 [2024-04-25 23:54:14.632604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.240 [2024-04-25 23:54:14.632653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.240 [2024-04-25 23:54:14.632667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.240 #35 NEW cov: 11665 ft: 14062 corp: 29/122b lim: 10 exec/s: 35 rss: 70Mb L: 7/10 MS: 1 ChangeByte- 00:07:25.240 [2024-04-25 23:54:14.672400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.240 [2024-04-25 23:54:14.672424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.240 #36 NEW cov: 11665 ft: 14085 corp: 30/125b lim: 10 exec/s: 36 rss: 70Mb L: 3/10 MS: 1 EraseBytes- 00:07:25.240 [2024-04-25 23:54:14.712606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a 
cdw11:00000000 00:07:25.241 [2024-04-25 23:54:14.712631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.241 [2024-04-25 23:54:14.712681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:25.241 [2024-04-25 23:54:14.712694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.241 #37 NEW cov: 11665 ft: 14098 corp: 31/129b lim: 10 exec/s: 37 rss: 70Mb L: 4/10 MS: 1 ChangeBinInt- 00:07:25.241 [2024-04-25 23:54:14.742664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:25.241 [2024-04-25 23:54:14.742692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.241 [2024-04-25 23:54:14.742743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.241 [2024-04-25 23:54:14.742756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.241 #39 NEW cov: 11665 ft: 14131 corp: 32/134b lim: 10 exec/s: 39 rss: 70Mb L: 5/10 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:25.241 [2024-04-25 23:54:14.772934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ab6 cdw11:00000000 00:07:25.241 [2024-04-25 23:54:14.772959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.241 [2024-04-25 23:54:14.773012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b6b6 cdw11:00000000 00:07:25.241 [2024-04-25 23:54:14.773026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.241 [2024-04-25 23:54:14.773078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 00:07:25.241 [2024-04-25 23:54:14.773093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.241 #40 NEW cov: 11665 ft: 14144 corp: 33/140b lim: 10 exec/s: 40 rss: 70Mb L: 6/10 MS: 1 ChangeBinInt- 00:07:25.241 [2024-04-25 23:54:14.812885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003202 cdw11:00000000 00:07:25.241 [2024-04-25 23:54:14.812910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.241 [2024-04-25 23:54:14.812959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000202 cdw11:00000000 00:07:25.241 [2024-04-25 23:54:14.812973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.241 #41 NEW cov: 11665 ft: 14157 corp: 34/144b lim: 10 exec/s: 41 rss: 70Mb L: 4/10 MS: 1 CopyPart- 00:07:25.500 [2024-04-25 23:54:14.852926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.852951] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.500 #42 NEW cov: 11665 ft: 14193 corp: 35/146b lim: 10 exec/s: 42 rss: 70Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:25.500 [2024-04-25 23:54:14.883473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007fd5 cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.883498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:14.883549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000dd03 cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.883562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:14.883610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005a15 cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.883624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:14.883672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007700 cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.883685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:14.883733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.883750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:25.500 #43 NEW cov: 11665 ft: 14230 corp: 36/156b lim: 10 exec/s: 43 rss: 70Mb L: 10/10 MS: 1 ChangeBit- 00:07:25.500 [2024-04-25 23:54:14.923362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.923386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:14.923442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.923456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:14.923504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.923517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.500 #44 NEW cov: 11665 ft: 14235 corp: 37/163b lim: 10 exec/s: 44 rss: 70Mb L: 7/10 MS: 1 CopyPart- 00:07:25.500 [2024-04-25 23:54:14.963476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.963501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:14.963550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ 
(04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.963563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:14.963610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.500 [2024-04-25 23:54:14.963624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.500 #45 NEW cov: 11665 ft: 14248 corp: 38/169b lim: 10 exec/s: 45 rss: 70Mb L: 6/10 MS: 1 CopyPart- 00:07:25.500 [2024-04-25 23:54:15.003380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4a cdw11:00000000 00:07:25.500 [2024-04-25 23:54:15.003411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.500 #46 NEW cov: 11665 ft: 14267 corp: 39/171b lim: 10 exec/s: 46 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:25.500 [2024-04-25 23:54:15.043493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000071f5 cdw11:00000000 00:07:25.500 [2024-04-25 23:54:15.043520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.500 #47 NEW cov: 11665 ft: 14276 corp: 40/173b lim: 10 exec/s: 47 rss: 70Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:25.500 [2024-04-25 23:54:15.073893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008000 cdw11:00000000 00:07:25.500 [2024-04-25 23:54:15.073919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:15.073968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.500 [2024-04-25 23:54:15.073981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:15.074031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.500 [2024-04-25 23:54:15.074045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.500 [2024-04-25 23:54:15.074096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:25.500 [2024-04-25 23:54:15.074109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.500 #48 NEW cov: 11665 ft: 14290 corp: 41/182b lim: 10 exec/s: 48 rss: 70Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:25.759 [2024-04-25 23:54:15.113713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.759 [2024-04-25 23:54:15.113738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.759 #49 NEW cov: 11665 ft: 14291 corp: 42/184b lim: 10 exec/s: 49 rss: 70Mb L: 2/10 MS: 1 EraseBytes- 00:07:25.760 [2024-04-25 23:54:15.154122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ 
(04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.154147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.154199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.154212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.154263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.154276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.154326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.154337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.760 #50 NEW cov: 11665 ft: 14295 corp: 43/193b lim: 10 exec/s: 50 rss: 70Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:25.760 [2024-04-25 23:54:15.193904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f841 cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.193929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.760 #51 NEW cov: 11665 ft: 14298 corp: 44/195b lim: 10 exec/s: 51 rss: 70Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:25.760 [2024-04-25 23:54:15.224144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d2d2 cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.224170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.224221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d2f8 cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.224235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.760 #52 NEW cov: 11665 ft: 14304 corp: 45/200b lim: 10 exec/s: 52 rss: 70Mb L: 5/10 MS: 1 InsertRepeatedBytes- 00:07:25.760 [2024-04-25 23:54:15.264564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7f cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.264588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.264638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d55d cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.264651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.264698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000035a cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.264714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.264762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00001577 cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.264775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.264825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.264839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:25.760 #53 NEW cov: 11665 ft: 14316 corp: 46/210b lim: 10 exec/s: 53 rss: 70Mb L: 10/10 MS: 1 PersAutoDict- DE: "\177\325]\003Z\025w\000"- 00:07:25.760 [2024-04-25 23:54:15.304571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.304596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.304646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008a0a cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.304659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.304706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.304720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.760 [2024-04-25 23:54:15.304768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:25.760 [2024-04-25 23:54:15.304781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.760 #54 NEW cov: 11665 ft: 14321 corp: 47/218b lim: 10 exec/s: 27 rss: 70Mb L: 8/10 MS: 1 CopyPart- 00:07:25.760 #54 DONE cov: 11665 ft: 14321 corp: 47/218b lim: 10 exec/s: 27 rss: 70Mb 00:07:25.760 ###### Recommended dictionary. ###### 00:07:25.760 "\177\325]\003Z\025w\000" # Uses: 2 00:07:25.760 "\200\000\000\000\000\000\000\000" # Uses: 0 00:07:25.760 ###### End of recommended dictionary. 
###### 00:07:25.760 Done 54 runs in 2 second(s) 00:07:26.019 23:54:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:26.019 23:54:15 -- ../common.sh@72 -- # (( i++ )) 00:07:26.019 23:54:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:26.019 23:54:15 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:26.019 23:54:15 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:26.019 23:54:15 -- nvmf/run.sh@24 -- # local timen=1 00:07:26.019 23:54:15 -- nvmf/run.sh@25 -- # local core=0x1 00:07:26.019 23:54:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:26.019 23:54:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:26.019 23:54:15 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:26.019 23:54:15 -- nvmf/run.sh@29 -- # port=4407 00:07:26.019 23:54:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:26.019 23:54:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:26.019 23:54:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:26.019 23:54:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:26.019 [2024-04-25 23:54:15.476556] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:26.019 [2024-04-25 23:54:15.476638] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid474778 ] 00:07:26.019 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.278 [2024-04-25 23:54:15.725513] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.278 [2024-04-25 23:54:15.753786] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:26.278 [2024-04-25 23:54:15.753908] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.278 [2024-04-25 23:54:15.805529] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:26.278 [2024-04-25 23:54:15.821823] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:26.278 INFO: Running with entropic power schedule (0xFF, 100). 00:07:26.278 INFO: Seed: 1442530534 00:07:26.278 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:26.278 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:26.278 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:26.278 INFO: A corpus is not provided, starting from an empty corpus 00:07:26.278 #2 INITED exec/s: 0 rss: 59Mb 00:07:26.278 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:26.278 This may also happen if the target rejected all inputs we tried so far 00:07:26.278 [2024-04-25 23:54:15.870720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:26.278 [2024-04-25 23:54:15.870747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.796 NEW_FUNC[1/662]: 0x4a8a50 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:26.796 NEW_FUNC[2/662]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:26.796 #3 NEW cov: 11438 ft: 11437 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CrossOver- 00:07:26.796 [2024-04-25 23:54:16.181501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:26.796 [2024-04-25 23:54:16.181534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.796 #7 NEW cov: 11551 ft: 11840 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 4 EraseBytes-CopyPart-CrossOver-CrossOver- 00:07:26.796 [2024-04-25 23:54:16.221532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:26.796 [2024-04-25 23:54:16.221557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.796 #8 NEW cov: 11557 ft: 12051 corp: 4/7b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:07:26.796 [2024-04-25 23:54:16.261670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000890a cdw11:00000000 00:07:26.796 [2024-04-25 23:54:16.261696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.796 #9 NEW cov: 11642 ft: 12277 corp: 5/9b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:26.796 [2024-04-25 23:54:16.301844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:26.796 [2024-04-25 23:54:16.301869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.796 [2024-04-25 23:54:16.301920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:26.796 [2024-04-25 23:54:16.301933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.797 #10 NEW cov: 11642 ft: 12538 corp: 6/13b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 CopyPart- 00:07:26.797 [2024-04-25 23:54:16.341901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000090a cdw11:00000000 00:07:26.797 [2024-04-25 23:54:16.341926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.797 #11 NEW cov: 11642 ft: 12752 corp: 7/15b lim: 10 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 ChangeBit- 00:07:26.797 [2024-04-25 23:54:16.382105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 
nsid:0 cdw10:00000a0a cdw11:00000000 00:07:26.797 [2024-04-25 23:54:16.382130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.797 [2024-04-25 23:54:16.382182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:26.797 [2024-04-25 23:54:16.382198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.797 #12 NEW cov: 11642 ft: 12822 corp: 8/19b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 CopyPart- 00:07:27.055 [2024-04-25 23:54:16.422271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000400a cdw11:00000000 00:07:27.055 [2024-04-25 23:54:16.422298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.055 [2024-04-25 23:54:16.422350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.056 [2024-04-25 23:54:16.422366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.056 #13 NEW cov: 11642 ft: 12860 corp: 9/23b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 ChangeByte- 00:07:27.056 [2024-04-25 23:54:16.462325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.056 [2024-04-25 23:54:16.462350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.056 #14 NEW cov: 11642 ft: 12883 corp: 10/26b lim: 10 exec/s: 0 rss: 68Mb L: 3/4 MS: 1 InsertByte- 00:07:27.056 [2024-04-25 23:54:16.502359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aca cdw11:00000000 00:07:27.056 [2024-04-25 23:54:16.502384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.056 #16 NEW cov: 11642 ft: 13008 corp: 11/29b lim: 10 exec/s: 0 rss: 68Mb L: 3/4 MS: 2 ShuffleBytes-CrossOver- 00:07:27.056 [2024-04-25 23:54:16.532557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000400a cdw11:00000000 00:07:27.056 [2024-04-25 23:54:16.532580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.056 [2024-04-25 23:54:16.532598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a40 cdw11:00000000 00:07:27.056 [2024-04-25 23:54:16.532609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.056 [2024-04-25 23:54:16.532625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.056 [2024-04-25 23:54:16.532635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.056 #17 NEW cov: 11642 ft: 13189 corp: 12/36b lim: 10 exec/s: 0 rss: 69Mb L: 7/7 MS: 1 CopyPart- 00:07:27.056 [2024-04-25 23:54:16.572610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:00000a0b cdw11:00000000 00:07:27.056 [2024-04-25 23:54:16.572635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.056 #18 NEW cov: 11642 ft: 13223 corp: 13/38b lim: 10 exec/s: 0 rss: 69Mb L: 2/7 MS: 1 ChangeBit- 00:07:27.056 [2024-04-25 23:54:16.602658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aca cdw11:00000000 00:07:27.056 [2024-04-25 23:54:16.602683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.056 #19 NEW cov: 11642 ft: 13236 corp: 14/41b lim: 10 exec/s: 0 rss: 69Mb L: 3/7 MS: 1 CopyPart- 00:07:27.056 [2024-04-25 23:54:16.642903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007676 cdw11:00000000 00:07:27.056 [2024-04-25 23:54:16.642928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.056 [2024-04-25 23:54:16.642979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000760a cdw11:00000000 00:07:27.056 [2024-04-25 23:54:16.642992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.056 #20 NEW cov: 11642 ft: 13237 corp: 15/45b lim: 10 exec/s: 0 rss: 69Mb L: 4/7 MS: 1 InsertRepeatedBytes- 00:07:27.315 [2024-04-25 23:54:16.682880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000080a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.682906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.315 #21 NEW cov: 11642 ft: 13274 corp: 16/47b lim: 10 exec/s: 0 rss: 69Mb L: 2/7 MS: 1 ChangeBit- 00:07:27.315 [2024-04-25 23:54:16.723050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000890a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.723075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.315 #22 NEW cov: 11642 ft: 13279 corp: 17/49b lim: 10 exec/s: 0 rss: 69Mb L: 2/7 MS: 1 ShuffleBytes- 00:07:27.315 [2024-04-25 23:54:16.753338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000400a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.753363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.315 [2024-04-25 23:54:16.753420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.753434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.315 [2024-04-25 23:54:16.753484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.753498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.315 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:27.315 #23 NEW 
cov: 11665 ft: 13393 corp: 18/56b lim: 10 exec/s: 0 rss: 69Mb L: 7/7 MS: 1 ChangeBit- 00:07:27.315 [2024-04-25 23:54:16.793217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a8a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.793241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.315 #24 NEW cov: 11665 ft: 13419 corp: 19/58b lim: 10 exec/s: 0 rss: 69Mb L: 2/7 MS: 1 ChangeBit- 00:07:27.315 [2024-04-25 23:54:16.833686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000040c7 cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.833711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.315 [2024-04-25 23:54:16.833762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.833775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.315 [2024-04-25 23:54:16.833830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.833844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.315 [2024-04-25 23:54:16.833895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.833908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.315 #25 NEW cov: 11665 ft: 13658 corp: 20/66b lim: 10 exec/s: 25 rss: 69Mb L: 8/8 MS: 1 InsertByte- 00:07:27.315 [2024-04-25 23:54:16.873478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.873503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.315 #26 NEW cov: 11665 ft: 13763 corp: 21/68b lim: 10 exec/s: 26 rss: 69Mb L: 2/8 MS: 1 ChangeBit- 00:07:27.315 [2024-04-25 23:54:16.903778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.903803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.315 [2024-04-25 23:54:16.903854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008a0a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.903867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.315 [2024-04-25 23:54:16.903919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.315 [2024-04-25 23:54:16.903932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.574 #27 NEW cov: 11665 ft: 13769 corp: 22/74b lim: 10 exec/s: 27 rss: 69Mb L: 6/8 MS: 1 CrossOver- 00:07:27.574 [2024-04-25 
23:54:16.943668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aca cdw11:00000000 00:07:27.575 [2024-04-25 23:54:16.943693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.575 #28 NEW cov: 11665 ft: 13797 corp: 23/76b lim: 10 exec/s: 28 rss: 69Mb L: 2/8 MS: 1 CrossOver- 00:07:27.575 [2024-04-25 23:54:16.974069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.575 [2024-04-25 23:54:16.974094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.575 [2024-04-25 23:54:16.974145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.575 [2024-04-25 23:54:16.974158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.575 [2024-04-25 23:54:16.974209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000c700 cdw11:00000000 00:07:27.575 [2024-04-25 23:54:16.974223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.575 [2024-04-25 23:54:16.974273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000400a cdw11:00000000 00:07:27.575 [2024-04-25 23:54:16.974287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.575 #29 NEW cov: 11665 ft: 13810 corp: 24/84b lim: 10 exec/s: 29 rss: 70Mb L: 8/8 MS: 1 ShuffleBytes- 00:07:27.575 [2024-04-25 23:54:17.014003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.575 [2024-04-25 23:54:17.014030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.575 [2024-04-25 23:54:17.014081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.575 [2024-04-25 23:54:17.014094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.575 #30 NEW cov: 11665 ft: 13820 corp: 25/89b lim: 10 exec/s: 30 rss: 70Mb L: 5/8 MS: 1 CrossOver- 00:07:27.575 [2024-04-25 23:54:17.043968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.575 [2024-04-25 23:54:17.043991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.575 #32 NEW cov: 11665 ft: 13833 corp: 26/92b lim: 10 exec/s: 32 rss: 70Mb L: 3/8 MS: 2 EraseBytes-CrossOver- 00:07:27.575 [2024-04-25 23:54:17.084165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a2f cdw11:00000000 00:07:27.575 [2024-04-25 23:54:17.084189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.575 [2024-04-25 23:54:17.084242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 
nsid:0 cdw10:0000ca0a cdw11:00000000 00:07:27.575 [2024-04-25 23:54:17.084254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.575 #33 NEW cov: 11665 ft: 13893 corp: 27/96b lim: 10 exec/s: 33 rss: 70Mb L: 4/8 MS: 1 InsertByte- 00:07:27.575 [2024-04-25 23:54:17.124327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.575 [2024-04-25 23:54:17.124351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.575 [2024-04-25 23:54:17.124404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a07 cdw11:00000000 00:07:27.575 [2024-04-25 23:54:17.124418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.575 #34 NEW cov: 11665 ft: 13926 corp: 28/100b lim: 10 exec/s: 34 rss: 70Mb L: 4/8 MS: 1 ChangeBinInt- 00:07:27.575 [2024-04-25 23:54:17.164411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.575 [2024-04-25 23:54:17.164435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.575 [2024-04-25 23:54:17.164482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004aca cdw11:00000000 00:07:27.575 [2024-04-25 23:54:17.164495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.834 #35 NEW cov: 11665 ft: 13942 corp: 29/104b lim: 10 exec/s: 35 rss: 70Mb L: 4/8 MS: 1 InsertByte- 00:07:27.834 [2024-04-25 23:54:17.204458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a0a cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.204482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.834 #36 NEW cov: 11665 ft: 13945 corp: 30/107b lim: 10 exec/s: 36 rss: 70Mb L: 3/8 MS: 1 ChangeBit- 00:07:27.834 [2024-04-25 23:54:17.244874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005bf2 cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.244898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.834 [2024-04-25 23:54:17.244948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002620 cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.244961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.834 [2024-04-25 23:54:17.245013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006037 cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.245026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.834 [2024-04-25 23:54:17.245075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.245088] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.834 #40 NEW cov: 11665 ft: 13988 corp: 31/116b lim: 10 exec/s: 40 rss: 70Mb L: 9/9 MS: 4 EraseBytes-ChangeByte-ShuffleBytes-CMP- DE: "\362& `7\177\000\000"- 00:07:27.834 [2024-04-25 23:54:17.284922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000219b cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.284947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.834 [2024-04-25 23:54:17.284999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009b9b cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.285014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.834 [2024-04-25 23:54:17.285062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00009b9b cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.285075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.834 #43 NEW cov: 11665 ft: 14005 corp: 32/123b lim: 10 exec/s: 43 rss: 70Mb L: 7/9 MS: 3 ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:07:27.834 [2024-04-25 23:54:17.324903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007676 cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.324927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.834 [2024-04-25 23:54:17.324977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f60a cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.324991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.834 #44 NEW cov: 11665 ft: 14025 corp: 33/127b lim: 10 exec/s: 44 rss: 70Mb L: 4/9 MS: 1 ChangeBit- 00:07:27.834 [2024-04-25 23:54:17.365051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000400a cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.365077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.834 [2024-04-25 23:54:17.365127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001a0a cdw11:00000000 00:07:27.834 [2024-04-25 23:54:17.365141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.834 #45 NEW cov: 11665 ft: 14052 corp: 34/131b lim: 10 exec/s: 45 rss: 70Mb L: 4/9 MS: 1 ChangeBit- 00:07:27.834 [2024-04-25 23:54:17.405145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a02 cdw11:00000000 00:07:27.835 [2024-04-25 23:54:17.405170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.835 [2024-04-25 23:54:17.405222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.835 [2024-04-25 23:54:17.405235] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.835 #46 NEW cov: 11665 ft: 14066 corp: 35/135b lim: 10 exec/s: 46 rss: 70Mb L: 4/9 MS: 1 ChangeBit- 00:07:27.835 [2024-04-25 23:54:17.435325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000021ff cdw11:00000000 00:07:27.835 [2024-04-25 23:54:17.435353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.835 [2024-04-25 23:54:17.435405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff9b cdw11:00000000 00:07:27.835 [2024-04-25 23:54:17.435419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.835 [2024-04-25 23:54:17.435467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00009b9b cdw11:00000000 00:07:27.835 [2024-04-25 23:54:17.435480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.094 #47 NEW cov: 11665 ft: 14082 corp: 36/142b lim: 10 exec/s: 47 rss: 70Mb L: 7/9 MS: 1 CMP- DE: "\377\377"- 00:07:28.094 [2024-04-25 23:54:17.475466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b8f5 cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.475491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.094 [2024-04-25 23:54:17.475541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f5bf cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.475554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.094 [2024-04-25 23:54:17.475606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.475619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.094 #48 NEW cov: 11665 ft: 14092 corp: 37/149b lim: 10 exec/s: 48 rss: 70Mb L: 7/9 MS: 1 ChangeBinInt- 00:07:28.094 [2024-04-25 23:54:17.515711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000400a cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.515735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.094 [2024-04-25 23:54:17.515788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.515801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.094 [2024-04-25 23:54:17.515852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.515866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.094 [2024-04-25 23:54:17.515914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 
cdw10:00000a0a cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.515927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.094 #49 NEW cov: 11665 ft: 14125 corp: 38/157b lim: 10 exec/s: 49 rss: 70Mb L: 8/9 MS: 1 CrossOver- 00:07:28.094 [2024-04-25 23:54:17.555616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a2f cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.555640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.094 [2024-04-25 23:54:17.555692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ca0b cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.555706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.094 #50 NEW cov: 11665 ft: 14133 corp: 39/161b lim: 10 exec/s: 50 rss: 70Mb L: 4/9 MS: 1 ChangeBit- 00:07:28.094 [2024-04-25 23:54:17.595527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000400a cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.595553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.094 #51 NEW cov: 11665 ft: 14139 corp: 40/163b lim: 10 exec/s: 51 rss: 70Mb L: 2/9 MS: 1 EraseBytes- 00:07:28.094 [2024-04-25 23:54:17.626106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.626130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.094 [2024-04-25 23:54:17.626179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002620 cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.626193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.094 [2024-04-25 23:54:17.626242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006037 cdw11:00000000 00:07:28.094 [2024-04-25 23:54:17.626256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.095 [2024-04-25 23:54:17.626305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:28.095 [2024-04-25 23:54:17.626318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.095 [2024-04-25 23:54:17.626367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000008 cdw11:00000000 00:07:28.095 [2024-04-25 23:54:17.626381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:28.095 #52 NEW cov: 11665 ft: 14181 corp: 41/173b lim: 10 exec/s: 52 rss: 70Mb L: 10/10 MS: 1 CrossOver- 00:07:28.095 [2024-04-25 23:54:17.665872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000450a cdw11:00000000 00:07:28.095 [2024-04-25 23:54:17.665897] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.095 [2024-04-25 23:54:17.665949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.095 [2024-04-25 23:54:17.665963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.095 #53 NEW cov: 11665 ft: 14259 corp: 42/178b lim: 10 exec/s: 53 rss: 70Mb L: 5/10 MS: 1 InsertByte- 00:07:28.354 [2024-04-25 23:54:17.705858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aca cdw11:00000000 00:07:28.354 [2024-04-25 23:54:17.705883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.354 #54 NEW cov: 11665 ft: 14268 corp: 43/181b lim: 10 exec/s: 54 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:07:28.354 [2024-04-25 23:54:17.746169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.354 [2024-04-25 23:54:17.746193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.354 [2024-04-25 23:54:17.746244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a4a4 cdw11:00000000 00:07:28.354 [2024-04-25 23:54:17.746257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.354 [2024-04-25 23:54:17.746307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a44a cdw11:00000000 00:07:28.354 [2024-04-25 23:54:17.746321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.354 #55 NEW cov: 11665 ft: 14331 corp: 44/188b lim: 10 exec/s: 55 rss: 70Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:07:28.354 [2024-04-25 23:54:17.786062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000400a cdw11:00000000 00:07:28.354 [2024-04-25 23:54:17.786087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.354 #56 NEW cov: 11665 ft: 14421 corp: 45/191b lim: 10 exec/s: 56 rss: 70Mb L: 3/10 MS: 1 EraseBytes- 00:07:28.354 [2024-04-25 23:54:17.826153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aca cdw11:00000000 00:07:28.354 [2024-04-25 23:54:17.826177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.354 #58 NEW cov: 11665 ft: 14423 corp: 46/194b lim: 10 exec/s: 58 rss: 70Mb L: 3/10 MS: 2 EraseBytes-CrossOver- 00:07:28.354 [2024-04-25 23:54:17.866276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:28.354 [2024-04-25 23:54:17.866301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.354 #59 NEW cov: 11665 ft: 14468 corp: 47/196b lim: 10 exec/s: 29 rss: 70Mb L: 2/10 MS: 1 CrossOver- 00:07:28.354 #59 DONE cov: 11665 ft: 14468 corp: 47/196b lim: 10 exec/s: 29 rss: 70Mb 00:07:28.354 ###### Recommended dictionary. 
###### 00:07:28.354 "\362& `7\177\000\000" # Uses: 0 00:07:28.354 "\377\377" # Uses: 0 00:07:28.354 ###### End of recommended dictionary. ###### 00:07:28.354 Done 59 runs in 2 second(s) 00:07:28.613 23:54:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:28.613 23:54:17 -- ../common.sh@72 -- # (( i++ )) 00:07:28.613 23:54:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:28.613 23:54:17 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:28.613 23:54:17 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:28.613 23:54:17 -- nvmf/run.sh@24 -- # local timen=1 00:07:28.613 23:54:17 -- nvmf/run.sh@25 -- # local core=0x1 00:07:28.613 23:54:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:28.613 23:54:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:28.613 23:54:17 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:28.613 23:54:18 -- nvmf/run.sh@29 -- # port=4408 00:07:28.614 23:54:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:28.614 23:54:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:28.614 23:54:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:28.614 23:54:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:28.614 [2024-04-25 23:54:18.040864] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:28.614 [2024-04-25 23:54:18.040957] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475110 ] 00:07:28.614 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.614 [2024-04-25 23:54:18.221045] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.873 [2024-04-25 23:54:18.240703] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:28.873 [2024-04-25 23:54:18.240825] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.873 [2024-04-25 23:54:18.292309] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:28.873 [2024-04-25 23:54:18.308612] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:28.873 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:28.873 INFO: Seed: 3929560215 00:07:28.873 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:28.873 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:28.873 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:28.873 INFO: A corpus is not provided, starting from an empty corpus 00:07:28.873 [2024-04-25 23:54:18.357849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.873 [2024-04-25 23:54:18.357879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.873 #2 INITED cov: 11456 ft: 11467 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:28.873 [2024-04-25 23:54:18.388000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.873 [2024-04-25 23:54:18.388026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.873 [2024-04-25 23:54:18.388082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.873 [2024-04-25 23:54:18.388096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.873 #3 NEW cov: 11579 ft: 12749 corp: 2/3b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:07:28.873 [2024-04-25 23:54:18.428100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.873 [2024-04-25 23:54:18.428127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.873 [2024-04-25 23:54:18.428186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.873 [2024-04-25 23:54:18.428200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.873 #4 NEW cov: 11585 ft: 12915 corp: 3/5b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:07:28.873 [2024-04-25 23:54:18.467992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.873 [2024-04-25 23:54:18.468018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.135 #5 NEW cov: 11670 ft: 13174 corp: 4/6b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 EraseBytes- 00:07:29.135 [2024-04-25 23:54:18.508460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.135 [2024-04-25 23:54:18.508486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.135 [2024-04-25 23:54:18.508546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.135 [2024-04-25 23:54:18.508560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.135 [2024-04-25 23:54:18.508618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.135 [2024-04-25 23:54:18.508633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.135 #6 NEW cov: 11670 ft: 13495 corp: 5/9b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 InsertByte- 00:07:29.135 [2024-04-25 23:54:18.548324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.135 [2024-04-25 23:54:18.548354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.135 #7 NEW cov: 11670 ft: 13593 corp: 6/10b lim: 5 exec/s: 0 rss: 66Mb L: 1/3 MS: 1 EraseBytes- 00:07:29.135 [2024-04-25 23:54:18.578526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.135 [2024-04-25 23:54:18.578551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.135 [2024-04-25 23:54:18.578610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.135 [2024-04-25 23:54:18.578625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.135 #8 NEW cov: 11670 ft: 13649 corp: 7/12b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 InsertByte- 00:07:29.135 [2024-04-25 23:54:18.618673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.135 [2024-04-25 23:54:18.618697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.135 [2024-04-25 23:54:18.618753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.135 [2024-04-25 23:54:18.618768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.135 #9 NEW cov: 11670 ft: 13681 corp: 8/14b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 ChangeBit- 00:07:29.135 [2024-04-25 23:54:18.659272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.135 [2024-04-25 23:54:18.659297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.135 [2024-04-25 23:54:18.659353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:29.135 [2024-04-25 23:54:18.659368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.136 [2024-04-25 23:54:18.659437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.136 [2024-04-25 23:54:18.659452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.136 [2024-04-25 23:54:18.659507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.136 [2024-04-25 23:54:18.659521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.136 [2024-04-25 23:54:18.659576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.136 [2024-04-25 23:54:18.659590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:29.136 #10 NEW cov: 11670 ft: 14011 corp: 9/19b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:29.136 [2024-04-25 23:54:18.698869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.136 [2024-04-25 23:54:18.698894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.136 [2024-04-25 23:54:18.698952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.136 [2024-04-25 23:54:18.698970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.136 #11 NEW cov: 11670 ft: 14088 corp: 10/21b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeBit- 00:07:29.136 [2024-04-25 23:54:18.738895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.136 [2024-04-25 23:54:18.738920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.413 #12 NEW cov: 11670 ft: 14107 corp: 11/22b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:29.413 [2024-04-25 23:54:18.779149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.779173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.413 [2024-04-25 23:54:18.779230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.779245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.413 #13 NEW cov: 11670 ft: 
14146 corp: 12/24b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeBit- 00:07:29.413 [2024-04-25 23:54:18.819533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.819558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.413 [2024-04-25 23:54:18.819615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.819629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.413 [2024-04-25 23:54:18.819685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.819700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.413 [2024-04-25 23:54:18.819771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.819785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.413 #14 NEW cov: 11670 ft: 14174 corp: 13/28b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 InsertByte- 00:07:29.413 [2024-04-25 23:54:18.859685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.859710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.413 [2024-04-25 23:54:18.859766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.859780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.413 [2024-04-25 23:54:18.859835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.859849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.413 [2024-04-25 23:54:18.859911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.859925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.413 #15 NEW cov: 11670 ft: 14213 corp: 14/32b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:29.413 [2024-04-25 23:54:18.899337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 
[2024-04-25 23:54:18.899362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.413 #16 NEW cov: 11670 ft: 14244 corp: 15/33b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:29.413 [2024-04-25 23:54:18.929944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.929969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.413 [2024-04-25 23:54:18.930026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.930040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.413 [2024-04-25 23:54:18.930098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.413 [2024-04-25 23:54:18.930111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.414 [2024-04-25 23:54:18.930166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.414 [2024-04-25 23:54:18.930180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.414 #17 NEW cov: 11670 ft: 14277 corp: 16/37b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 CopyPart- 00:07:29.414 [2024-04-25 23:54:18.970196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.414 [2024-04-25 23:54:18.970222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.414 [2024-04-25 23:54:18.970277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.414 [2024-04-25 23:54:18.970291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.414 [2024-04-25 23:54:18.970346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.414 [2024-04-25 23:54:18.970360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.414 [2024-04-25 23:54:18.970415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.414 [2024-04-25 23:54:18.970428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.414 [2024-04-25 23:54:18.970487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.414 [2024-04-25 23:54:18.970504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:29.414 #18 NEW cov: 11670 ft: 14311 corp: 17/42b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeBit- 00:07:29.414 [2024-04-25 23:54:19.009874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.414 [2024-04-25 23:54:19.009898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.414 [2024-04-25 23:54:19.009959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.414 [2024-04-25 23:54:19.009973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.679 #19 NEW cov: 11670 ft: 14324 corp: 18/44b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 InsertByte- 00:07:29.679 [2024-04-25 23:54:19.050258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.050283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.050340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.050354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.050405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.050419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.050474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.050488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.679 #20 NEW cov: 11670 ft: 14347 corp: 19/48b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:29.679 [2024-04-25 23:54:19.089920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.089944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.679 #21 NEW cov: 11670 ft: 14480 corp: 20/49b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 EraseBytes- 00:07:29.679 [2024-04-25 23:54:19.130497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.130521] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.130578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.130592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.130647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.130662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.130711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.130724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.679 #22 NEW cov: 11670 ft: 14503 corp: 21/53b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 ShuffleBytes- 00:07:29.679 [2024-04-25 23:54:19.170630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.170655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.170713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.170728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.170782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.170796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.170841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.170854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.679 #23 NEW cov: 11670 ft: 14532 corp: 22/57b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 CrossOver- 00:07:29.679 [2024-04-25 23:54:19.210895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.210920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.210977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.210991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.211048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.211062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.211117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.211131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.211190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.211203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:29.679 #24 NEW cov: 11670 ft: 14538 corp: 23/62b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertByte- 00:07:29.679 [2024-04-25 23:54:19.250830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.250856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.250915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.250930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.250984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.250998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.679 [2024-04-25 23:54:19.251052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.679 [2024-04-25 23:54:19.251066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.938 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:29.938 #25 NEW cov: 11693 ft: 14571 corp: 24/66b lim: 5 exec/s: 25 rss: 69Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:30.197 [2024-04-25 23:54:19.551399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.551432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:30.197 [2024-04-25 23:54:19.551495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.551510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.197 #26 NEW cov: 11693 ft: 14597 corp: 25/68b lim: 5 exec/s: 26 rss: 69Mb L: 2/5 MS: 1 ChangeBit- 00:07:30.197 [2024-04-25 23:54:19.591433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.591459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.197 [2024-04-25 23:54:19.591520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.591534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.197 #27 NEW cov: 11693 ft: 14618 corp: 26/70b lim: 5 exec/s: 27 rss: 69Mb L: 2/5 MS: 1 CrossOver- 00:07:30.197 [2024-04-25 23:54:19.631703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.631729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.197 [2024-04-25 23:54:19.631790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.631804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.197 [2024-04-25 23:54:19.631864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.631878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.197 #28 NEW cov: 11693 ft: 14626 corp: 27/73b lim: 5 exec/s: 28 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:30.197 [2024-04-25 23:54:19.671690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.671717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.197 [2024-04-25 23:54:19.671774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.671788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.197 #29 NEW cov: 11693 ft: 14641 corp: 28/75b lim: 5 exec/s: 29 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:07:30.197 [2024-04-25 23:54:19.711584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.711610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.197 #30 NEW cov: 11693 ft: 14649 corp: 29/76b lim: 5 exec/s: 30 rss: 69Mb L: 1/5 MS: 1 CopyPart- 00:07:30.197 [2024-04-25 23:54:19.751878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.751904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.197 [2024-04-25 23:54:19.751960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.751975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.197 #31 NEW cov: 11693 ft: 14676 corp: 30/78b lim: 5 exec/s: 31 rss: 69Mb L: 2/5 MS: 1 ChangeBit- 00:07:30.197 [2024-04-25 23:54:19.792283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.792308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.197 [2024-04-25 23:54:19.792365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.792379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.197 [2024-04-25 23:54:19.792441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.792455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.197 [2024-04-25 23:54:19.792512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.197 [2024-04-25 23:54:19.792526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.457 #32 NEW cov: 11693 ft: 14686 corp: 31/82b lim: 5 exec/s: 32 rss: 70Mb L: 4/5 MS: 1 CMP- DE: "\377\377"- 00:07:30.457 [2024-04-25 23:54:19.832609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.832634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:19.832691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.832708] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:19.832763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.832777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:19.832835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.832848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:19.832907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.832922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:30.457 #33 NEW cov: 11693 ft: 14694 corp: 32/87b lim: 5 exec/s: 33 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:30.457 [2024-04-25 23:54:19.872078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.872103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.457 #34 NEW cov: 11693 ft: 14709 corp: 33/88b lim: 5 exec/s: 34 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:07:30.457 [2024-04-25 23:54:19.902824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.902848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:19.902908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.902922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:19.902979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.902993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:19.903046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.903060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:19.903117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:30.457 [2024-04-25 23:54:19.903132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:30.457 #35 NEW cov: 11693 ft: 14714 corp: 34/93b lim: 5 exec/s: 35 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:07:30.457 [2024-04-25 23:54:19.942463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.942488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:19.942549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.942565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.457 #36 NEW cov: 11693 ft: 14736 corp: 35/95b lim: 5 exec/s: 36 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:30.457 [2024-04-25 23:54:19.982537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.982562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:19.982620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:19.982634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.457 #37 NEW cov: 11693 ft: 14744 corp: 36/97b lim: 5 exec/s: 37 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:30.457 [2024-04-25 23:54:20.022694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:20.022720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:20.022779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:20.022793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.457 #38 NEW cov: 11693 ft: 14747 corp: 37/99b lim: 5 exec/s: 38 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:30.457 [2024-04-25 23:54:20.063180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:20.063205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:20.063260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:20.063275] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:20.063332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:20.063347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.457 [2024-04-25 23:54:20.063405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.457 [2024-04-25 23:54:20.063420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.731 #39 NEW cov: 11693 ft: 14755 corp: 38/103b lim: 5 exec/s: 39 rss: 70Mb L: 4/5 MS: 1 PersAutoDict- DE: "\377\377"- 00:07:30.731 [2024-04-25 23:54:20.103401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.103427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.103481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.103496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.103557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.103572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.103633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.103647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.103706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.103720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:30.731 #40 NEW cov: 11693 ft: 14760 corp: 39/108b lim: 5 exec/s: 40 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:07:30.731 [2024-04-25 23:54:20.143315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.143340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.143404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:30.731 [2024-04-25 23:54:20.143434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.143496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.143510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.143572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.143586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.731 #41 NEW cov: 11693 ft: 14786 corp: 40/112b lim: 5 exec/s: 41 rss: 70Mb L: 4/5 MS: 1 CMP- DE: "\377\001\000\000"- 00:07:30.731 [2024-04-25 23:54:20.182960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.182984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.731 #42 NEW cov: 11693 ft: 14859 corp: 41/113b lim: 5 exec/s: 42 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:07:30.731 [2024-04-25 23:54:20.223412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.223438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.223499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.223512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.223571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.223588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.731 #43 NEW cov: 11693 ft: 14899 corp: 42/116b lim: 5 exec/s: 43 rss: 70Mb L: 3/5 MS: 1 EraseBytes- 00:07:30.731 [2024-04-25 23:54:20.263901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.263927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.263986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.264000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.731 
[2024-04-25 23:54:20.264059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.264074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.264131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.264145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.264203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.264217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:30.731 #44 NEW cov: 11693 ft: 14900 corp: 43/121b lim: 5 exec/s: 44 rss: 70Mb L: 5/5 MS: 1 CMP- DE: "\004\000"- 00:07:30.731 [2024-04-25 23:54:20.303707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.303732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.303790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.303804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.731 [2024-04-25 23:54:20.303864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.731 [2024-04-25 23:54:20.303878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.731 #45 NEW cov: 11693 ft: 14904 corp: 44/124b lim: 5 exec/s: 45 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:07:30.991 [2024-04-25 23:54:20.343976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.991 [2024-04-25 23:54:20.344002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.991 [2024-04-25 23:54:20.344060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.991 [2024-04-25 23:54:20.344075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.991 [2024-04-25 23:54:20.344132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.991 [2024-04-25 23:54:20.344148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.991 [2024-04-25 23:54:20.344201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.991 [2024-04-25 23:54:20.344215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.991 #46 NEW cov: 11693 ft: 14907 corp: 45/128b lim: 5 exec/s: 23 rss: 70Mb L: 4/5 MS: 1 CrossOver- 00:07:30.991 #46 DONE cov: 11693 ft: 14907 corp: 45/128b lim: 5 exec/s: 23 rss: 70Mb 00:07:30.991 ###### Recommended dictionary. ###### 00:07:30.991 "\377\377" # Uses: 1 00:07:30.991 "\377\001\000\000" # Uses: 0 00:07:30.991 "\004\000" # Uses: 0 00:07:30.991 ###### End of recommended dictionary. ###### 00:07:30.991 Done 46 runs in 2 second(s) 00:07:30.991 23:54:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:30.991 23:54:20 -- ../common.sh@72 -- # (( i++ )) 00:07:30.991 23:54:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:30.991 23:54:20 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:30.991 23:54:20 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:30.991 23:54:20 -- nvmf/run.sh@24 -- # local timen=1 00:07:30.991 23:54:20 -- nvmf/run.sh@25 -- # local core=0x1 00:07:30.991 23:54:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:30.991 23:54:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:30.991 23:54:20 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:30.991 23:54:20 -- nvmf/run.sh@29 -- # port=4409 00:07:30.991 23:54:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:30.991 23:54:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:30.991 23:54:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:30.991 23:54:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:30.991 [2024-04-25 23:54:20.514863] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:30.991 [2024-04-25 23:54:20.514946] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475608 ]
00:07:30.991 EAL: No free 2048 kB hugepages reported on node 1
00:07:31.250 [2024-04-25 23:54:20.693204] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:31.250 [2024-04-25 23:54:20.712353] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:31.250 [2024-04-25 23:54:20.712499] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:31.250 [2024-04-25 23:54:20.763947] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:31.250 [2024-04-25 23:54:20.780253] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 ***
00:07:31.250 INFO: Running with entropic power schedule (0xFF, 100).
00:07:31.250 INFO: Seed: 2106583094
00:07:31.250 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23),
00:07:31.250 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798),
00:07:31.250 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:07:31.250 INFO: A corpus is not provided, starting from an empty corpus
00:07:31.250 [2024-04-25 23:54:20.825424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.250 [2024-04-25 23:54:20.825453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.250 #2 INITED cov: 11466 ft: 11464 corp: 1/1b exec/s: 0 rss: 66Mb
00:07:31.250 [2024-04-25 23:54:20.855385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.250 [2024-04-25 23:54:20.855415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.509 #3 NEW cov: 11579 ft: 12052 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ShuffleBytes-
00:07:31.509 [2024-04-25 23:54:20.895659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.509 [2024-04-25 23:54:20.895684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.509 [2024-04-25 23:54:20.895736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.509 [2024-04-25 23:54:20.895749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.509 #4 NEW cov: 11585 ft: 12873 corp: 3/4b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte-
00:07:31.509 [2024-04-25 23:54:20.935735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.509 [2024-04-25 23:54:20.935759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.509 [2024-04-25 23:54:20.935814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.509 [2024-04-25 23:54:20.935827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.509 #5 NEW cov: 11670 ft: 13147 corp: 4/6b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ShuffleBytes-
00:07:31.509 [2024-04-25 23:54:20.975896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.509 [2024-04-25 23:54:20.975921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.509 [2024-04-25 23:54:20.975974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.509 [2024-04-25 23:54:20.975988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.509 #6 NEW cov: 11670 ft: 13235 corp: 5/8b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart-
00:07:31.509 [2024-04-25 23:54:21.015879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.509 [2024-04-25 23:54:21.015904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.509 #7 NEW cov: 11670 ft: 13280 corp: 6/9b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 EraseBytes-
00:07:31.509 [2024-04-25 23:54:21.056134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.509 [2024-04-25 23:54:21.056160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.509 [2024-04-25 23:54:21.056213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.509 [2024-04-25 23:54:21.056227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.509 #8 NEW cov: 11670 ft: 13340 corp: 7/11b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart-
00:07:31.509 [2024-04-25 23:54:21.096079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.509 [2024-04-25 23:54:21.096104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.509 #9 NEW cov: 11670 ft: 13435 corp: 8/12b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ChangeByte-
00:07:31.768 [2024-04-25 23:54:21.136403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.136429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.768 [2024-04-25 23:54:21.136484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.136497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.768 #10 NEW cov: 11670 ft: 13506 corp: 9/14b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeByte-
00:07:31.768 [2024-04-25 23:54:21.176322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.176347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.768 #11 NEW cov: 11670 ft: 13524 corp: 10/15b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 EraseBytes-
00:07:31.768 [2024-04-25 23:54:21.216504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.216529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.768 #12 NEW cov: 11670 ft: 13536 corp: 11/16b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 CopyPart-
00:07:31.768 [2024-04-25 23:54:21.246691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.246716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.768 [2024-04-25 23:54:21.246771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.246785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.768 #13 NEW cov: 11670 ft: 13553 corp: 12/18b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeByte-
00:07:31.768 [2024-04-25 23:54:21.286665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.286689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.768 #14 NEW cov: 11670 ft: 13585 corp: 13/19b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ShuffleBytes-
00:07:31.768 [2024-04-25 23:54:21.326888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.326914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.768 [2024-04-25 23:54:21.326967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.326981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:31.768 #15 NEW cov: 11670 ft: 13606 corp: 14/21b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeByte-
00:07:31.768 [2024-04-25 23:54:21.367014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.367042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:31.768 [2024-04-25 23:54:21.367097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:31.768 [2024-04-25 23:54:21.367111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.027 #16 NEW cov: 11670 ft: 13674 corp: 15/23b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeByte-
00:07:32.027 [2024-04-25 23:54:21.407294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.407319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.027 [2024-04-25 23:54:21.407374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.407387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.027 [2024-04-25 23:54:21.407453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.407468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:32.027 #17 NEW cov: 11670 ft: 13891 corp: 16/26b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 InsertByte-
00:07:32.027 [2024-04-25 23:54:21.447317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.447342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.027 [2024-04-25 23:54:21.447400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.447414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.027 #18 NEW cov: 11670 ft: 13970 corp: 17/28b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 ChangeByte-
00:07:32.027 [2024-04-25 23:54:21.487230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.487255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.027 #19 NEW cov: 11670 ft: 13985 corp: 18/29b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ChangeByte-
00:07:32.027 [2024-04-25 23:54:21.527360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.527385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.027 #20 NEW cov: 11670 ft: 14042 corp: 19/30b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ChangeByte-
00:07:32.027 [2024-04-25 23:54:21.567443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.567473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.027 #21 NEW cov: 11670 ft: 14085 corp: 20/31b lim: 5 exec/s: 0 rss: 68Mb L: 1/3 MS: 1 ChangeByte-
00:07:32.027 [2024-04-25 23:54:21.607720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.607745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.027 [2024-04-25 23:54:21.607800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.607815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.027 #22 NEW cov: 11670 ft: 14093 corp: 21/33b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ShuffleBytes-
00:07:32.027 [2024-04-25 23:54:21.637815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.027 [2024-04-25 23:54:21.637839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.286 [2024-04-25 23:54:21.637894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.286 [2024-04-25 23:54:21.637909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.286 #23 NEW cov: 11670 ft: 14105 corp: 22/35b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBit-
00:07:32.286 [2024-04-25 23:54:21.677911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.286 [2024-04-25 23:54:21.677935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.286 [2024-04-25 23:54:21.677990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.286 [2024-04-25 23:54:21.678003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.286 #24 NEW cov: 11670 ft: 14114 corp: 23/37b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ChangeByte-
00:07:32.286 [2024-04-25 23:54:21.718221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.286 [2024-04-25 23:54:21.718246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.286 [2024-04-25 23:54:21.718302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.286 [2024-04-25 23:54:21.718316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.286 [2024-04-25 23:54:21.718370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.286 [2024-04-25 23:54:21.718384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:32.545 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:32.545 #25 NEW cov: 11693 ft: 14158 corp: 24/40b lim: 5 exec/s: 25 rss: 69Mb L: 3/3 MS: 1 InsertByte-
00:07:32.545 [2024-04-25 23:54:22.018888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.545 [2024-04-25 23:54:22.018926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.545 [2024-04-25 23:54:22.018987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.545 [2024-04-25 23:54:22.019003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.545 #26 NEW cov: 11693 ft: 14185 corp: 25/42b lim: 5 exec/s: 26 rss: 69Mb L: 2/3 MS: 1 ChangeBinInt-
00:07:32.545 [2024-04-25 23:54:22.068962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.545 [2024-04-25 23:54:22.068987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.545 [2024-04-25 23:54:22.069043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.545 [2024-04-25 23:54:22.069057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.545 #27 NEW cov: 11693 ft: 14230 corp: 26/44b lim: 5 exec/s: 27 rss: 69Mb L: 2/3 MS: 1 InsertByte-
00:07:32.545 [2024-04-25 23:54:22.109100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.545 [2024-04-25 23:54:22.109124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.545 [2024-04-25 23:54:22.109179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.545 [2024-04-25 23:54:22.109192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.545 [2024-04-25 23:54:22.139202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.545 [2024-04-25 23:54:22.139228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.545 [2024-04-25 23:54:22.139281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.545 [2024-04-25 23:54:22.139295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.804 #29 NEW cov: 11693 ft: 14239 corp: 27/46b lim: 5 exec/s: 29 rss: 69Mb L: 2/3 MS: 2 ChangeByte-CrossOver-
00:07:32.804 [2024-04-25 23:54:22.179325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.179349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.804 [2024-04-25 23:54:22.179411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.179425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.804 #30 NEW cov: 11693 ft: 14267 corp: 28/48b lim: 5 exec/s: 30 rss: 70Mb L: 2/2 MS: 1 InsertByte-
00:07:32.804 [2024-04-25 23:54:22.219734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.219758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.804 [2024-04-25 23:54:22.219817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.219831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.804 [2024-04-25 23:54:22.219882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.219896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:32.804 [2024-04-25 23:54:22.219949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.219962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:32.804 #31 NEW cov: 11693 ft: 14544 corp: 29/52b lim: 5 exec/s: 31 rss: 70Mb L: 4/4 MS: 1 InsertRepeatedBytes-
00:07:32.804 [2024-04-25 23:54:22.259540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.259565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.804 [2024-04-25 23:54:22.259620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.259634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.804 #32 NEW cov: 11693 ft: 14583 corp: 30/54b lim: 5 exec/s: 32 rss: 70Mb L: 2/4 MS: 1 CopyPart-
00:07:32.804 [2024-04-25 23:54:22.299760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.299784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.804 [2024-04-25 23:54:22.299838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.299851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.804 [2024-04-25 23:54:22.299904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.299917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:32.804 #33 NEW cov: 11693 ft: 14589 corp: 31/57b lim: 5 exec/s: 33 rss: 70Mb L: 3/4 MS: 1 CopyPart-
00:07:32.804 [2024-04-25 23:54:22.339741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.339765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.804 [2024-04-25 23:54:22.339819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.339832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.804 #34 NEW cov: 11693 ft: 14632 corp: 32/59b lim: 5 exec/s: 34 rss: 70Mb L: 2/4 MS: 1 ChangeByte-
00:07:32.804 [2024-04-25 23:54:22.379855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.379884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.804 [2024-04-25 23:54:22.379940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.804 [2024-04-25 23:54:22.379954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.804 #35 NEW cov: 11693 ft: 14666 corp: 33/61b lim: 5 exec/s: 35 rss: 70Mb L: 2/4 MS: 1 ChangeByte-
00:07:33.063 [2024-04-25 23:54:22.420171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.420197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.063 [2024-04-25 23:54:22.420252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.420266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.063 [2024-04-25 23:54:22.420321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.420335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:33.063 #36 NEW cov: 11693 ft: 14676 corp: 34/64b lim: 5 exec/s: 36 rss: 70Mb L: 3/4 MS: 1 InsertByte-
00:07:33.063 [2024-04-25 23:54:22.460106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.460131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.063 [2024-04-25 23:54:22.460182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.460195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.063 #37 NEW cov: 11693 ft: 14684 corp: 35/66b lim: 5 exec/s: 37 rss: 70Mb L: 2/4 MS: 1 ChangeBit-
00:07:33.063 [2024-04-25 23:54:22.500079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.500104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.063 #38 NEW cov: 11693 ft: 14689 corp: 36/67b lim: 5 exec/s: 38 rss: 70Mb L: 1/4 MS: 1 EraseBytes-
00:07:33.063 [2024-04-25 23:54:22.540537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.540563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.063 [2024-04-25 23:54:22.540621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.540635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.063 [2024-04-25 23:54:22.540689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.540703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:33.063 #39 NEW cov: 11693 ft: 14696 corp: 37/70b lim: 5 exec/s: 39 rss: 70Mb L: 3/4 MS: 1 CrossOver-
00:07:33.063 [2024-04-25 23:54:22.580612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.580639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.063 [2024-04-25 23:54:22.580694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.580708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.063 [2024-04-25 23:54:22.580763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.063 [2024-04-25 23:54:22.580777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:33.063 #40 NEW cov: 11693 ft: 14698 corp: 38/73b lim: 5 exec/s: 40 rss: 70Mb L: 3/4 MS: 1 ChangeBit-
00:07:33.063 [2024-04-25 23:54:22.620592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.064 [2024-04-25 23:54:22.620617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.064 [2024-04-25 23:54:22.620671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.064 [2024-04-25 23:54:22.620685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.064 #41 NEW cov: 11693 ft: 14713 corp: 39/75b lim: 5 exec/s: 41 rss: 70Mb L: 2/4 MS: 1 CopyPart-
00:07:33.064 [2024-04-25 23:54:22.650824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.064 [2024-04-25 23:54:22.650848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.064 [2024-04-25 23:54:22.650901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.064 [2024-04-25 23:54:22.650915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.064 [2024-04-25 23:54:22.650965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.064 [2024-04-25 23:54:22.650980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:33.064 #42 NEW cov: 11693 ft: 14771 corp: 40/78b lim: 5 exec/s: 42 rss: 70Mb L: 3/4 MS: 1 CrossOver-
00:07:33.321 [2024-04-25 23:54:22.690775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.690800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.321 [2024-04-25 23:54:22.690854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.690868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.321 #43 NEW cov: 11693 ft: 14774 corp: 41/80b lim: 5 exec/s: 43 rss: 70Mb L: 2/4 MS: 1 ChangeBinInt-
00:07:33.321 [2024-04-25 23:54:22.731228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.731256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.321 [2024-04-25 23:54:22.731310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.731324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.321 [2024-04-25 23:54:22.731376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.731390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:33.321 [2024-04-25 23:54:22.731446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.731459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:33.321 #44 NEW cov: 11693 ft: 14782 corp: 42/84b lim: 5 exec/s: 44 rss: 70Mb L: 4/4 MS: 1 CrossOver-
00:07:33.321 [2024-04-25 23:54:22.770997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.771022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.321 [2024-04-25 23:54:22.771075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.771089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.321 #45 NEW cov: 11693 ft: 14794 corp: 43/86b lim: 5 exec/s: 45 rss: 70Mb L: 2/4 MS: 1 ShuffleBytes-
00:07:33.321 [2024-04-25 23:54:22.811260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.811285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.321 [2024-04-25 23:54:22.811338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.811351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.321 [2024-04-25 23:54:22.811407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:33.321 [2024-04-25 23:54:22.811421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:33.321 #46 NEW cov: 11693 ft: 14816 corp: 44/89b lim: 5 exec/s: 23 rss: 70Mb L: 3/4 MS: 1 ChangeBit-
00:07:33.321 #46 DONE cov: 11693 ft: 14816 corp: 44/89b lim: 5 exec/s: 23 rss: 70Mb
00:07:33.321 Done 46 runs in 2 second(s)
00:07:33.580 23:54:22 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf
23:54:22 -- ../common.sh@72 -- # (( i++ ))
23:54:22 -- ../common.sh@72 -- # (( i < fuzz_num ))
23:54:22 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1
23:54:22 -- nvmf/run.sh@23 -- # local fuzzer_type=10
23:54:22 -- nvmf/run.sh@24 -- # local timen=1
23:54:22 -- nvmf/run.sh@25 -- # local core=0x1
23:54:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
23:54:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf
23:54:22 -- nvmf/run.sh@29 -- # printf %02d 10
23:54:22 -- nvmf/run.sh@29 -- # port=4410
23:54:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
23:54:22 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410'
23:54:22 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
23:54:22 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock
[2024-04-25 23:54:22.993961] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:07:33.580 [2024-04-25 23:54:22.994055] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476092 ]
00:07:33.580 EAL: No free 2048 kB hugepages reported on node 1
00:07:33.580 [2024-04-25 23:54:23.171104] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:33.580 [2024-04-25 23:54:23.190479] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:33.580 [2024-04-25 23:54:23.190611] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:33.839 [2024-04-25 23:54:23.242136] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:33.839 [2024-04-25 23:54:23.258419] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 ***
00:07:33.839 INFO: Running with entropic power schedule (0xFF, 100).
00:07:33.839 INFO: Seed: 289632884
00:07:33.839 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23),
00:07:33.839 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798),
00:07:33.839 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:33.839 INFO: A corpus is not provided, starting from an empty corpus
00:07:33.839 #2 INITED exec/s: 0 rss: 59Mb
00:07:33.839 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? This may also happen if the target rejected all inputs we tried so far
00:07:33.839 [2024-04-25 23:54:23.303970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:33.839 [2024-04-25 23:54:23.303997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.839 [2024-04-25 23:54:23.304055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:33.839 [2024-04-25 23:54:23.304068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.839 [2024-04-25 23:54:23.304124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:33.839 [2024-04-25 23:54:23.304138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:33.839 [2024-04-25 23:54:23.304192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:33.839 [2024-04-25 23:54:23.304205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.097 NEW_FUNC[1/660]: 0x4aa3c0 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205
00:07:34.097 NEW_FUNC[2/660]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:34.097 #6 NEW cov: 11465 ft: 11464 corp: 2/36b lim: 40 exec/s: 0 rss: 67Mb L: 35/35 MS: 4 ShuffleBytes-ChangeByte-CrossOver-InsertRepeatedBytes-
00:07:34.097 [2024-04-25 23:54:23.604385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.097 [2024-04-25 23:54:23.604419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.097 NEW_FUNC[1/3]: 0x16dff00 in nvme_complete_register_operations /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:726
00:07:34.097 NEW_FUNC[2/3]: 0x16f3060 in nvme_robust_mutex_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1100
00:07:34.097 #8 NEW cov: 11602 ft: 12621 corp: 3/48b lim: 40 exec/s: 0 rss: 67Mb L: 12/35 MS: 2 ChangeBinInt-CrossOver-
00:07:34.097 [2024-04-25 23:54:23.644432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e70a cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.097 [2024-04-25 23:54:23.644457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.097 #9 NEW cov: 11608 ft: 12958 corp: 4/61b lim: 40 exec/s: 0 rss: 67Mb L: 13/35 MS: 1 CrossOver-
00:07:34.097 [2024-04-25 23:54:23.684828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.097 [2024-04-25 23:54:23.684853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.097 [2024-04-25 23:54:23.684917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.097 [2024-04-25 23:54:23.684932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.097 [2024-04-25 23:54:23.684992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.097 [2024-04-25 23:54:23.685006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.097 #19 NEW cov: 11693 ft: 13409 corp: 5/87b lim: 40 exec/s: 0 rss: 67Mb L: 26/35 MS: 5 ChangeByte-ChangeBit-InsertByte-ChangeByte-InsertRepeatedBytes-
00:07:34.355 [2024-04-25 23:54:23.725188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.725213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.355 [2024-04-25 23:54:23.725275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.725289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.355 [2024-04-25 23:54:23.725347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.725361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.355 [2024-04-25 23:54:23.725427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.725441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.355 [2024-04-25 23:54:23.725501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e70a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.725518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:34.355 #25 NEW cov: 11693 ft: 13556 corp: 6/127b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 1 CrossOver-
00:07:34.355 [2024-04-25 23:54:23.764913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e70a cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.764939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.355 [2024-04-25 23:54:23.765003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e7e7 cdw11:e70ae7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.765017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.355 #26 NEW cov: 11693 ft: 13878 corp: 7/150b lim: 40 exec/s: 0 rss: 67Mb L: 23/40 MS: 1 CopyPart-
00:07:34.355 [2024-04-25 23:54:23.804896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e70a cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.804921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.355 #27 NEW cov: 11693 ft: 13986 corp: 8/160b lim: 40 exec/s: 0 rss: 67Mb L: 10/40 MS: 1 EraseBytes-
00:07:34.355 [2024-04-25 23:54:23.845313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.845339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.355 [2024-04-25 23:54:23.845403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.845416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.355 [2024-04-25 23:54:23.845477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.845491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.355 #28 NEW cov: 11693 ft: 14038 corp: 9/186b lim: 40 exec/s: 0 rss: 68Mb L: 26/40 MS: 1 ShuffleBytes-
00:07:34.355 [2024-04-25 23:54:23.885158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.885183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.355 #29 NEW cov: 11693 ft: 14075 corp: 10/199b lim: 40 exec/s: 0 rss: 68Mb L: 13/40 MS: 1 InsertByte-
00:07:34.355 [2024-04-25 23:54:23.925507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.925531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.355 [2024-04-25 23:54:23.925593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ffff7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.925608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.356 [2024-04-25 23:54:23.925668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:10c8170c cdw11:89000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.356 [2024-04-25 23:54:23.925683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.356 #30 NEW cov: 11693 ft: 14111 corp: 11/225b lim: 40 exec/s: 0 rss: 68Mb L: 26/40 MS: 1 CMP- DE: "\377\377~\020\310\027\014\211"-
00:07:34.356 [2024-04-25 23:54:23.965636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.965661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.355 [2024-04-25 23:54:23.965727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.965742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.355 [2024-04-25 23:54:23.965806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.355 [2024-04-25 23:54:23.965820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.615 #31 NEW cov: 11693 ft: 14153 corp: 12/251b lim: 40 exec/s: 0 rss: 68Mb L: 26/40 MS: 1 CopyPart-
00:07:34.615 [2024-04-25 23:54:24.005918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e70a cdw11:e7000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.005942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.006005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.006020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.006080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.006095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.006156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.006169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.615 #32 NEW cov: 11693 ft: 14165 corp: 13/285b lim: 40 exec/s: 0 rss: 68Mb L: 34/40 MS: 1 InsertRepeatedBytes-
00:07:34.615 [2024-04-25 23:54:24.045844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.045869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.045934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.045948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.046011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.046025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.615 #33 NEW cov: 11693 ft: 14207 corp: 14/311b lim: 40 exec/s: 0 rss: 68Mb L: 26/40 MS: 1 ShuffleBytes-
00:07:34.615 [2024-04-25 23:54:24.086135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.086164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.086227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.086243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.086304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.086319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.086378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:7fe7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.086398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.615 #34 NEW cov: 11693 ft: 14234 corp: 15/346b lim: 40 exec/s: 0 rss: 68Mb L: 35/40 MS: 1 ChangeByte-
00:07:34.615 [2024-04-25 23:54:24.126074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:80000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.126100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.126164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.126178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.126243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.126257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.615 #35 NEW cov: 11693 ft: 14316 corp: 16/372b lim: 40 exec/s: 0 rss: 68Mb L: 26/40 MS: 1 ChangeBit-
00:07:34.615 [2024-04-25 23:54:24.165926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7f7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.165952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.615 #36 NEW cov: 11693 ft: 14386 corp: 17/385b lim: 40 exec/s: 0 rss: 68Mb L: 13/40 MS: 1 ChangeBit-
00:07:34.615 [2024-04-25 23:54:24.206219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0a01 cdw11:e7e70ae7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.206245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.615 [2024-04-25 23:54:24.206309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.615 [2024-04-25 23:54:24.206323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.615 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:34.615 #41 NEW cov: 11716 ft: 14425 corp: 18/401b lim: 40 exec/s: 0 rss: 68Mb L: 16/40 MS: 5 CopyPart-ShuffleBytes-ShuffleBytes-CopyPart-CrossOver-
00:07:34.615 [2024-04-25 23:54:24.246173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.874 [2024-04-25 23:54:24.246202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.874 #42 NEW cov: 11716 ft: 14447 corp: 19/414b lim: 40 exec/s: 0 rss: 68Mb L: 13/40 MS: 1 InsertByte-
00:07:34.874 [2024-04-25 23:54:24.286359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0aff76 cdw11:155b19da SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.874 [2024-04-25 23:54:24.286405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.874 [2024-04-25 23:54:24.286469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:b3b2e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.874 [2024-04-25 23:54:24.286484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.874 #43 NEW cov: 11716 ft: 14451 corp: 20/430b lim: 40 exec/s: 43 rss: 68Mb L: 16/40 MS: 1 CMP- DE: "\377v\025[\031\332\263\262"-
00:07:34.874 [2024-04-25 23:54:24.326660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ff7e10c8 cdw11:170c8900 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.874 [2024-04-25 23:54:24.326685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.874 [2024-04-25 23:54:24.326749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00002674 cdw11:00ffff7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.874 [2024-04-25 23:54:24.326764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.874 [2024-04-25 23:54:24.326827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:10c8170c cdw11:89000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.874 [2024-04-25 23:54:24.326842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.874 #44 NEW cov: 11716 ft: 14457 corp: 21/456b lim: 40 exec/s: 44 rss: 68Mb L: 26/40 MS: 1 CopyPart-
00:07:34.874 [2024-04-25 23:54:24.366654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.874 [2024-04-25 23:54:24.366680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.874 [2024-04-25 23:54:24.366743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e700 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.874 [2024-04-25 23:54:24.366757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.874 #45 NEW cov: 11716 ft: 14503 corp: 22/477b lim: 40 exec/s: 45 rss: 69Mb L: 21/40 MS: 1 InsertRepeatedBytes-
00:07:34.874 [2024-04-25 23:54:24.406622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.874 [2024-04-25 23:54:24.406648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0
m:0 dnr:0 00:07:34.874 #46 NEW cov: 11716 ft: 14518 corp: 23/490b lim: 40 exec/s: 46 rss: 69Mb L: 13/40 MS: 1 ShuffleBytes- 00:07:34.874 [2024-04-25 23:54:24.446906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.874 [2024-04-25 23:54:24.446932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.874 [2024-04-25 23:54:24.446996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e7ff cdw11:ff7e10c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.874 [2024-04-25 23:54:24.447014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.874 #47 NEW cov: 11716 ft: 14550 corp: 24/511b lim: 40 exec/s: 47 rss: 69Mb L: 21/40 MS: 1 PersAutoDict- DE: "\377\377~\020\310\027\014\211"- 00:07:35.131 [2024-04-25 23:54:24.486873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7f7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.131 [2024-04-25 23:54:24.486899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.131 #48 NEW cov: 11716 ft: 14556 corp: 25/519b lim: 40 exec/s: 48 rss: 69Mb L: 8/40 MS: 1 EraseBytes- 00:07:35.131 [2024-04-25 23:54:24.527017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e721e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.131 [2024-04-25 23:54:24.527042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.131 #49 NEW cov: 11716 ft: 14568 corp: 26/532b lim: 40 exec/s: 49 rss: 69Mb L: 13/40 MS: 1 ChangeByte- 00:07:35.131 [2024-04-25 23:54:24.567386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e70a cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.131 [2024-04-25 23:54:24.567416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.131 [2024-04-25 23:54:24.567482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e7e7 cdw11:e70ae7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.131 [2024-04-25 23:54:24.567496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.131 [2024-04-25 23:54:24.567564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e7e7e732 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.131 [2024-04-25 23:54:24.567578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.131 #50 NEW cov: 11716 ft: 14588 corp: 27/556b lim: 40 exec/s: 50 rss: 69Mb L: 24/40 MS: 1 InsertByte- 00:07:35.132 [2024-04-25 23:54:24.607266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2be7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.132 [2024-04-25 23:54:24.607292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.132 #51 NEW cov: 11716 ft: 14596 corp: 28/569b lim: 40 exec/s: 51 rss: 69Mb L: 13/40 MS: 1 ChangeByte- 00:07:35.132 [2024-04-25 23:54:24.647347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7f7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.132 [2024-04-25 23:54:24.647372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.132 #52 NEW cov: 11716 ft: 14604 corp: 29/582b lim: 40 exec/s: 52 rss: 69Mb L: 13/40 MS: 1 ShuffleBytes- 00:07:35.132 [2024-04-25 23:54:24.687442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7ff cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.132 [2024-04-25 23:54:24.687466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.132 #53 NEW cov: 11716 ft: 14613 corp: 30/590b lim: 40 exec/s: 53 rss: 69Mb L: 8/40 MS: 1 ChangeBit- 00:07:35.132 [2024-04-25 23:54:24.727577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.132 [2024-04-25 23:54:24.727601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.390 #54 NEW cov: 11716 ft: 14628 corp: 31/603b lim: 40 exec/s: 54 rss: 69Mb L: 13/40 MS: 1 CopyPart- 00:07:35.390 [2024-04-25 23:54:24.768119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.768144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.768207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.768222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.768283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e7e7e7e7 cdw11:e7e701e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.768298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.768361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:e70ae700 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.768376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.390 #55 NEW cov: 11716 ft: 14699 corp: 32/638b lim: 40 exec/s: 55 rss: 69Mb L: 35/40 MS: 1 CrossOver- 00:07:35.390 [2024-04-25 23:54:24.808149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e70a cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.808174] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.808238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e7e7 cdw11:e7e70ae7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.808252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.808316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e7e7e7e7 cdw11:32e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.808330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.390 #56 NEW cov: 11716 ft: 14717 corp: 33/663b lim: 40 exec/s: 56 rss: 69Mb L: 25/40 MS: 1 CrossOver- 00:07:35.390 [2024-04-25 23:54:24.848070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.848095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.848158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e7e7e7ff cdw11:ff7f10c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.848172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.390 #57 NEW cov: 11716 ft: 14744 corp: 34/684b lim: 40 exec/s: 57 rss: 70Mb L: 21/40 MS: 1 ChangeBit- 00:07:35.390 [2024-04-25 23:54:24.888060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0140e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.888084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.390 #58 NEW cov: 11716 ft: 14747 corp: 35/697b lim: 40 exec/s: 58 rss: 70Mb L: 13/40 MS: 1 ChangeByte- 00:07:35.390 [2024-04-25 23:54:24.918440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:7e10c817 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.918467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.918531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0c890000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.918545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.918607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.918621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.390 #59 NEW cov: 11716 ft: 14755 corp: 36/723b lim: 40 exec/s: 59 rss: 70Mb L: 26/40 MS: 1 PersAutoDict- DE: 
"\377\377~\020\310\027\014\211"- 00:07:35.390 [2024-04-25 23:54:24.958311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:01e7e70a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.958336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.390 #60 NEW cov: 11716 ft: 14763 corp: 37/733b lim: 40 exec/s: 60 rss: 70Mb L: 10/40 MS: 1 CrossOver- 00:07:35.390 [2024-04-25 23:54:24.998819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e7ffff7e cdw11:10c8170c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.998844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.998906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:89e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.998921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.998982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.998997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.390 [2024-04-25 23:54:24.999061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:7fe7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.390 [2024-04-25 23:54:24.999075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.649 #61 NEW cov: 11716 ft: 14780 corp: 38/768b lim: 40 exec/s: 61 rss: 70Mb L: 35/40 MS: 1 PersAutoDict- DE: "\377\377~\020\310\027\014\211"- 00:07:35.649 [2024-04-25 23:54:25.038609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e73a cdw11:ed9dff5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.649 [2024-04-25 23:54:25.038634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.649 [2024-04-25 23:54:25.038697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:157700f7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.649 [2024-04-25 23:54:25.038712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.649 #62 NEW cov: 11716 ft: 14785 corp: 39/789b lim: 40 exec/s: 62 rss: 70Mb L: 21/40 MS: 1 CMP- DE: ":\355\235\377Z\025w\000"- 00:07:35.649 [2024-04-25 23:54:25.078899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.649 [2024-04-25 23:54:25.078925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.649 [2024-04-25 23:54:25.078987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.649 [2024-04-25 23:54:25.079002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.649 [2024-04-25 23:54:25.079066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.649 [2024-04-25 23:54:25.079080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.649 #63 NEW cov: 11716 ft: 14786 corp: 40/815b lim: 40 exec/s: 63 rss: 70Mb L: 26/40 MS: 1 ChangeBinInt- 00:07:35.649 [2024-04-25 23:54:25.118713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7e5 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.649 [2024-04-25 23:54:25.118737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.649 #64 NEW cov: 11716 ft: 14852 corp: 41/828b lim: 40 exec/s: 64 rss: 70Mb L: 13/40 MS: 1 ChangeBit- 00:07:35.649 [2024-04-25 23:54:25.158968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0aff cdw11:76155b19 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.649 [2024-04-25 23:54:25.158993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.649 [2024-04-25 23:54:25.159057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:dab3b2e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.649 [2024-04-25 23:54:25.159072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.649 #65 NEW cov: 11716 ft: 14857 corp: 42/844b lim: 40 exec/s: 65 rss: 70Mb L: 16/40 MS: 1 PersAutoDict- DE: "\377v\025[\031\332\263\262"- 00:07:35.649 [2024-04-25 23:54:25.198893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.649 [2024-04-25 23:54:25.198917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.649 #66 NEW cov: 11716 ft: 14872 corp: 43/853b lim: 40 exec/s: 66 rss: 70Mb L: 9/40 MS: 1 EraseBytes- 00:07:35.649 [2024-04-25 23:54:25.239028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01e7e7f7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.649 [2024-04-25 23:54:25.239052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.649 #67 NEW cov: 11716 ft: 14888 corp: 44/866b lim: 40 exec/s: 67 rss: 70Mb L: 13/40 MS: 1 ChangeByte- 00:07:35.908 [2024-04-25 23:54:25.279530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e7ffff7e cdw11:10c8170c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.908 [2024-04-25 23:54:25.279555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.908 [2024-04-25 23:54:25.279618] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:01e7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.908 [2024-04-25 23:54:25.279632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.908 [2024-04-25 23:54:25.279709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e7e7e7e7 cdw11:77e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.908 [2024-04-25 23:54:25.279727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.908 [2024-04-25 23:54:25.279789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:7fe7e7e7 cdw11:e7e7e7e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.908 [2024-04-25 23:54:25.279803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.908 #68 NEW cov: 11716 ft: 14979 corp: 45/901b lim: 40 exec/s: 34 rss: 70Mb L: 35/40 MS: 1 CrossOver- 00:07:35.908 #68 DONE cov: 11716 ft: 14979 corp: 45/901b lim: 40 exec/s: 34 rss: 70Mb 00:07:35.908 ###### Recommended dictionary. ###### 00:07:35.908 "\377\377~\020\310\027\014\211" # Uses: 3 00:07:35.908 "\377v\025[\031\332\263\262" # Uses: 1 00:07:35.908 ":\355\235\377Z\025w\000" # Uses: 0 00:07:35.908 ###### End of recommended dictionary. ###### 00:07:35.908 Done 68 runs in 2 second(s) 00:07:35.908 23:54:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:35.908 23:54:25 -- ../common.sh@72 -- # (( i++ )) 00:07:35.908 23:54:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:35.908 23:54:25 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:35.908 23:54:25 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:35.908 23:54:25 -- nvmf/run.sh@24 -- # local timen=1 00:07:35.908 23:54:25 -- nvmf/run.sh@25 -- # local core=0x1 00:07:35.908 23:54:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:35.908 23:54:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:35.908 23:54:25 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:35.908 23:54:25 -- nvmf/run.sh@29 -- # port=4411 00:07:35.908 23:54:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:35.908 23:54:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:35.908 23:54:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:35.908 23:54:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:35.908 [2024-04-25 23:54:25.461357] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:35.908 [2024-04-25 23:54:25.461455] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476437 ] 00:07:35.908 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.166 [2024-04-25 23:54:25.644985] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.166 [2024-04-25 23:54:25.664287] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:36.166 [2024-04-25 23:54:25.664416] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.166 [2024-04-25 23:54:25.716075] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:36.166 [2024-04-25 23:54:25.732367] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:36.166 INFO: Running with entropic power schedule (0xFF, 100). 00:07:36.166 INFO: Seed: 2765622731 00:07:36.166 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:36.166 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:36.166 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:36.166 INFO: A corpus is not provided, starting from an empty corpus 00:07:36.166 #2 INITED exec/s: 0 rss: 59Mb 00:07:36.166 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:36.166 This may also happen if the target rejected all inputs we tried so far 00:07:36.424 [2024-04-25 23:54:25.809294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.424 [2024-04-25 23:54:25.809336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.424 [2024-04-25 23:54:25.809465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.424 [2024-04-25 23:54:25.809482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.424 [2024-04-25 23:54:25.809614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.424 [2024-04-25 23:54:25.809630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.424 [2024-04-25 23:54:25.809752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.424 [2024-04-25 23:54:25.809769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.683 NEW_FUNC[1/664]: 0x4ac130 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:36.683 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:36.683 #5 NEW cov: 11501 ft: 11502 corp: 2/37b lim: 40 exec/s: 0 rss: 67Mb L: 36/36 MS: 3 
InsertByte-CopyPart-InsertRepeatedBytes- 00:07:36.683 [2024-04-25 23:54:26.149693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.683 [2024-04-25 23:54:26.149740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.683 [2024-04-25 23:54:26.149880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.683 [2024-04-25 23:54:26.149902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.683 #8 NEW cov: 11614 ft: 12430 corp: 3/59b lim: 40 exec/s: 0 rss: 67Mb L: 22/36 MS: 3 ChangeBit-ChangeBit-CrossOver- 00:07:36.683 [2024-04-25 23:54:26.189999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.683 [2024-04-25 23:54:26.190026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.683 [2024-04-25 23:54:26.190156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.684 [2024-04-25 23:54:26.190173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.684 [2024-04-25 23:54:26.190296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.684 [2024-04-25 23:54:26.190313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.684 #10 NEW cov: 11620 ft: 12812 corp: 4/88b lim: 40 exec/s: 0 rss: 67Mb L: 29/36 MS: 2 CopyPart-CrossOver- 00:07:36.684 [2024-04-25 23:54:26.229845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.684 [2024-04-25 23:54:26.229871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.684 [2024-04-25 23:54:26.229985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.684 [2024-04-25 23:54:26.230005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.684 #11 NEW cov: 11705 ft: 13044 corp: 5/110b lim: 40 exec/s: 0 rss: 67Mb L: 22/36 MS: 1 ShuffleBytes- 00:07:36.684 [2024-04-25 23:54:26.269972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.684 [2024-04-25 23:54:26.269998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.684 [2024-04-25 23:54:26.270123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:36.684 [2024-04-25 23:54:26.270138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.943 #12 NEW cov: 11705 ft: 13125 corp: 6/132b lim: 40 exec/s: 0 rss: 67Mb L: 22/36 MS: 1 ChangeBinInt- 00:07:36.944 [2024-04-25 23:54:26.310667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.310692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.310810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.310828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.310953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.310968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.311087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.311103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.944 #13 NEW cov: 11705 ft: 13188 corp: 7/168b lim: 40 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 CMP- DE: "\377\377\377\377"- 00:07:36.944 [2024-04-25 23:54:26.350739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.350765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.350885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.350902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.351027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.351042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.351163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:23ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.351179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.944 #14 NEW cov: 11705 ft: 13256 corp: 8/204b lim: 40 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 ChangeByte- 00:07:36.944 [2024-04-25 23:54:26.391199] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.391225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.391344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff9b9b9b cdw11:9bffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.391361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.391496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.391511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.391635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:23ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.391650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.391771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff2c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.391786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.944 #15 NEW cov: 11705 ft: 13397 corp: 9/244b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:36.944 [2024-04-25 23:54:26.441108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.441135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.441274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.441291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.441399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.441413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.441558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffff23ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.441573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.944 #16 NEW cov: 11705 ft: 13486 corp: 10/282b lim: 40 exec/s: 0 rss: 68Mb L: 38/40 MS: 1 CopyPart- 00:07:36.944 [2024-04-25 
23:54:26.481211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.481240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.481375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff1010 cdw11:10101010 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.481391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.481527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.481547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.481644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.481662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.944 #17 NEW cov: 11705 ft: 13609 corp: 11/317b lim: 40 exec/s: 0 rss: 68Mb L: 35/40 MS: 1 InsertRepeatedBytes- 00:07:36.944 [2024-04-25 23:54:26.531323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.531350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.531471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.531487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.531602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.531618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.944 [2024-04-25 23:54:26.531741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffff23ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.944 [2024-04-25 23:54:26.531757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.204 #18 NEW cov: 11705 ft: 13696 corp: 12/353b lim: 40 exec/s: 0 rss: 68Mb L: 36/40 MS: 1 EraseBytes- 00:07:37.204 [2024-04-25 23:54:26.571400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.571425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:37.204 [2024-04-25 23:54:26.571544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.571561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.571685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.571701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.571827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000ff09 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.571844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.204 #19 NEW cov: 11705 ft: 13716 corp: 13/386b lim: 40 exec/s: 0 rss: 68Mb L: 33/40 MS: 1 InsertRepeatedBytes- 00:07:37.204 [2024-04-25 23:54:26.611597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.611624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.611749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.611768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.611892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fffffffd cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.611907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.612022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffff23ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.612037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.204 #20 NEW cov: 11705 ft: 13773 corp: 14/422b lim: 40 exec/s: 0 rss: 68Mb L: 36/40 MS: 1 ChangeBinInt- 00:07:37.204 [2024-04-25 23:54:26.651711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.651738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.651855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.651870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.651991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.652008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.652133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffff23ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.652150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.204 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:37.204 #21 NEW cov: 11728 ft: 13857 corp: 15/460b lim: 40 exec/s: 0 rss: 68Mb L: 38/40 MS: 1 CrossOver- 00:07:37.204 [2024-04-25 23:54:26.691009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.691035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.204 #22 NEW cov: 11728 ft: 14658 corp: 16/475b lim: 40 exec/s: 0 rss: 68Mb L: 15/40 MS: 1 CrossOver- 00:07:37.204 [2024-04-25 23:54:26.731974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.732001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.732123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.732138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.732261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:40ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.732277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.204 [2024-04-25 23:54:26.732399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffff23 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.204 [2024-04-25 23:54:26.732414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.205 #23 NEW cov: 11728 ft: 14669 corp: 17/514b lim: 40 exec/s: 0 rss: 68Mb L: 39/40 MS: 1 InsertByte- 00:07:37.205 [2024-04-25 23:54:26.771558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e0affff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.205 [2024-04-25 23:54:26.771584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.205 [2024-04-25 23:54:26.771735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000ff09 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.205 [2024-04-25 23:54:26.771750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.205 #24 NEW cov: 11728 ft: 14677 corp: 18/531b lim: 40 exec/s: 24 rss: 68Mb L: 17/40 MS: 1 EraseBytes- 00:07:37.464 [2024-04-25 23:54:26.822038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:21312e0a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.822066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.822193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.822209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.822345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.822362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.464 #28 NEW cov: 11728 ft: 14681 corp: 19/555b lim: 40 exec/s: 28 rss: 68Mb L: 24/40 MS: 4 ChangeByte-InsertByte-ChangeByte-CrossOver- 00:07:37.464 [2024-04-25 23:54:26.861902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.861930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.862082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.862100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.862234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.862253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.862383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:23ffff00 cdw11:24ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.862403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.464 #29 NEW cov: 11728 ft: 14741 corp: 20/591b lim: 40 exec/s: 29 rss: 68Mb L: 36/40 MS: 1 ChangeBinInt- 00:07:37.464 [2024-04-25 23:54:26.902294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.902323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.902472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff2e cdw11:0affffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.902489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.902622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.902639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.902765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.902781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.464 #30 NEW cov: 11728 ft: 14772 corp: 21/624b lim: 40 exec/s: 30 rss: 68Mb L: 33/40 MS: 1 CrossOver- 00:07:37.464 [2024-04-25 23:54:26.941957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.941983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.942125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.942141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.942282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.942298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.464 #31 NEW cov: 11728 ft: 14795 corp: 22/653b lim: 40 exec/s: 31 rss: 68Mb L: 29/40 MS: 1 ChangeBit- 00:07:37.464 [2024-04-25 23:54:26.982189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.982229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.982391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.982413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:26.982547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:26.982563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.464 #32 NEW cov: 11728 ft: 14832 corp: 23/679b lim: 40 exec/s: 32 rss: 68Mb L: 26/40 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:37.464 [2024-04-25 23:54:27.022528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a02ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:27.022556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:27.022673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:27.022693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:27.022819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:27.022835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.464 #33 NEW cov: 11728 ft: 14843 corp: 24/708b lim: 40 exec/s: 33 rss: 68Mb L: 29/40 MS: 1 ChangeBit- 00:07:37.464 [2024-04-25 23:54:27.062888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.464 [2024-04-25 23:54:27.062913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.464 [2024-04-25 23:54:27.063057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.465 [2024-04-25 23:54:27.063073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.465 [2024-04-25 23:54:27.063196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.465 [2024-04-25 23:54:27.063212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.465 [2024-04-25 23:54:27.063334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.465 [2024-04-25 23:54:27.063350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.724 #34 NEW cov: 11728 ft: 14855 corp: 25/743b lim: 40 exec/s: 34 rss: 69Mb L: 35/40 MS: 1 CopyPart- 00:07:37.724 [2024-04-25 23:54:27.102795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a02ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.102820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.102944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff 
cdw11:c2ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.102961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.103091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.103106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.724 #35 NEW cov: 11728 ft: 14863 corp: 26/773b lim: 40 exec/s: 35 rss: 69Mb L: 30/40 MS: 1 InsertByte- 00:07:37.724 [2024-04-25 23:54:27.143046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.143073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.143205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.143222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.143347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.143368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.143503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffff23ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.143520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.143667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff2c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.143683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.724 #36 NEW cov: 11728 ft: 14905 corp: 27/813b lim: 40 exec/s: 36 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:37.724 [2024-04-25 23:54:27.183048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.183074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.183207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.183223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.183344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 
cdw10:ff2fffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.183360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.724 #37 NEW cov: 11728 ft: 14909 corp: 28/843b lim: 40 exec/s: 37 rss: 69Mb L: 30/40 MS: 1 InsertByte- 00:07:37.724 [2024-04-25 23:54:27.223752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.223780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.223897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff9b9b9b cdw11:9bffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.223913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.224048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.224065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.224197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.224215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.224344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff23ffff cdw11:ffff2c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.224361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.724 #38 NEW cov: 11728 ft: 14964 corp: 29/883b lim: 40 exec/s: 38 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:37.724 [2024-04-25 23:54:27.273401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.273430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.273558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.273575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.273694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.273711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.273836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 
nsid:0 cdw10:0024ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.273851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.724 #39 NEW cov: 11728 ft: 14975 corp: 30/919b lim: 40 exec/s: 39 rss: 69Mb L: 36/40 MS: 1 ChangeBinInt- 00:07:37.724 [2024-04-25 23:54:27.313088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a02ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.313114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.724 [2024-04-25 23:54:27.313228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.724 [2024-04-25 23:54:27.313244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.984 #40 NEW cov: 11728 ft: 14986 corp: 31/935b lim: 40 exec/s: 40 rss: 69Mb L: 16/40 MS: 1 EraseBytes- 00:07:37.984 [2024-04-25 23:54:27.353495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a02ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.353523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.353645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:c2ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.353661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.353777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.353793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.984 #41 NEW cov: 11728 ft: 15003 corp: 32/965b lim: 40 exec/s: 41 rss: 69Mb L: 30/40 MS: 1 ChangeBinInt- 00:07:37.984 [2024-04-25 23:54:27.393450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a02ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.393476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.393605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.393623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.984 #42 NEW cov: 11728 ft: 15013 corp: 33/985b lim: 40 exec/s: 42 rss: 69Mb L: 20/40 MS: 1 EraseBytes- 00:07:37.984 [2024-04-25 23:54:27.434074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.434100] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.434238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.434253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.434381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.434399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.434523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0024ff24 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.434539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.984 #43 NEW cov: 11728 ft: 15038 corp: 34/1021b lim: 40 exec/s: 43 rss: 69Mb L: 36/40 MS: 1 ChangeByte- 00:07:37.984 [2024-04-25 23:54:27.474197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.474222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.474358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.474375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.474503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00faffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.474519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.474644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:23ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.474659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.984 #44 NEW cov: 11728 ft: 15070 corp: 35/1057b lim: 40 exec/s: 44 rss: 69Mb L: 36/40 MS: 1 ChangeBinInt- 00:07:37.984 [2024-04-25 23:54:27.514512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.514539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.514662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.514681] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.984 [2024-04-25 23:54:27.514809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.984 [2024-04-25 23:54:27.514825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.985 [2024-04-25 23:54:27.514955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffff23ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.985 [2024-04-25 23:54:27.514972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.985 [2024-04-25 23:54:27.515092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff2c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.985 [2024-04-25 23:54:27.515108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.985 #45 NEW cov: 11728 ft: 15089 corp: 36/1097b lim: 40 exec/s: 45 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:07:37.985 [2024-04-25 23:54:27.564204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aa40aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.985 [2024-04-25 23:54:27.564232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.985 [2024-04-25 23:54:27.564355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff1010 cdw11:10101010 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.985 [2024-04-25 23:54:27.564372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.985 [2024-04-25 23:54:27.564511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.985 [2024-04-25 23:54:27.564527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.985 #50 NEW cov: 11728 ft: 15090 corp: 37/1124b lim: 40 exec/s: 50 rss: 69Mb L: 27/40 MS: 5 ShuffleBytes-InsertByte-CopyPart-CopyPart-CrossOver- 00:07:38.244 [2024-04-25 23:54:27.604073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a02ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.604101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.604235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00140000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.604252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.244 #51 NEW cov: 11728 ft: 15099 corp: 38/1144b lim: 40 exec/s: 51 rss: 69Mb L: 20/40 MS: 1 ChangeBinInt- 00:07:38.244 [2024-04-25 23:54:27.644759] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.644786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.644913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.644931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.645060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.645077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.645205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ff2823ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.645220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.244 #52 NEW cov: 11728 ft: 15102 corp: 39/1180b lim: 40 exec/s: 52 rss: 70Mb L: 36/40 MS: 1 ChangeByte- 00:07:38.244 [2024-04-25 23:54:27.684573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:21312e0a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.684600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.684729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fffeffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.684747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.684886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.684902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.244 #53 NEW cov: 11728 ft: 15127 corp: 40/1204b lim: 40 exec/s: 53 rss: 70Mb L: 24/40 MS: 1 ChangeBit- 00:07:38.244 [2024-04-25 23:54:27.725124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.725149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.725292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff9b9b9b cdw11:9bffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.725311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.725438] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffff2800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.725456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.725587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.725603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.725735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ff23ffff cdw11:ffff2c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.725753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:38.244 #54 NEW cov: 11728 ft: 15219 corp: 41/1244b lim: 40 exec/s: 54 rss: 70Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:38.244 [2024-04-25 23:54:27.765023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.765049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.765170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff1010 cdw11:10101010 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.765186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.244 [2024-04-25 23:54:27.765309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.244 [2024-04-25 23:54:27.765325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.245 [2024-04-25 23:54:27.765448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.245 [2024-04-25 23:54:27.765481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.245 #55 NEW cov: 11728 ft: 15233 corp: 42/1279b lim: 40 exec/s: 27 rss: 70Mb L: 35/40 MS: 1 CopyPart- 00:07:38.245 #55 DONE cov: 11728 ft: 15233 corp: 42/1279b lim: 40 exec/s: 27 rss: 70Mb 00:07:38.245 ###### Recommended dictionary. ###### 00:07:38.245 "\377\377\377\377" # Uses: 1 00:07:38.245 ###### End of recommended dictionary. 
###### 00:07:38.245 Done 55 runs in 2 second(s) 00:07:38.504 23:54:27 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:38.504 23:54:27 -- ../common.sh@72 -- # (( i++ )) 00:07:38.504 23:54:27 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:38.504 23:54:27 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:38.504 23:54:27 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:38.504 23:54:27 -- nvmf/run.sh@24 -- # local timen=1 00:07:38.504 23:54:27 -- nvmf/run.sh@25 -- # local core=0x1 00:07:38.504 23:54:27 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:38.504 23:54:27 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:38.504 23:54:27 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:38.504 23:54:27 -- nvmf/run.sh@29 -- # port=4412 00:07:38.504 23:54:27 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:38.504 23:54:27 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:38.504 23:54:27 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:38.504 23:54:27 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:38.504 [2024-04-25 23:54:27.944262] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:38.504 [2024-04-25 23:54:27.944332] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476979 ] 00:07:38.504 EAL: No free 2048 kB hugepages reported on node 1 00:07:38.764 [2024-04-25 23:54:28.121443] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.764 [2024-04-25 23:54:28.140789] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:38.764 [2024-04-25 23:54:28.140915] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.764 [2024-04-25 23:54:28.192363] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:38.764 [2024-04-25 23:54:28.208622] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:38.764 INFO: Running with entropic power schedule (0xFF, 100). 00:07:38.764 INFO: Seed: 944630881 00:07:38.764 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:38.764 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:38.764 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:38.764 INFO: A corpus is not provided, starting from an empty corpus 00:07:38.764 #2 INITED exec/s: 0 rss: 59Mb 00:07:38.764 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:38.764 This may also happen if the target rejected all inputs we tried so far 00:07:38.764 [2024-04-25 23:54:28.256843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.764 [2024-04-25 23:54:28.256871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.023 NEW_FUNC[1/664]: 0x4adea0 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:39.023 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:39.023 #3 NEW cov: 11483 ft: 11498 corp: 2/13b lim: 40 exec/s: 0 rss: 67Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:07:39.023 [2024-04-25 23:54:28.567689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.023 [2024-04-25 23:54:28.567721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.023 #4 NEW cov: 11610 ft: 11995 corp: 3/25b lim: 40 exec/s: 0 rss: 67Mb L: 12/12 MS: 1 CopyPart- 00:07:39.023 [2024-04-25 23:54:28.607698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.023 [2024-04-25 23:54:28.607723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.023 #5 NEW cov: 11618 ft: 12219 corp: 4/37b lim: 40 exec/s: 0 rss: 67Mb L: 12/12 MS: 1 CopyPart- 00:07:39.282 [2024-04-25 23:54:28.647803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:686f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.282 [2024-04-25 23:54:28.647829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.282 #9 NEW cov: 11703 ft: 12514 corp: 5/51b lim: 40 exec/s: 0 rss: 67Mb L: 14/14 MS: 4 ChangeBit-InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:39.282 [2024-04-25 23:54:28.687939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.282 [2024-04-25 23:54:28.687965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.282 #10 NEW cov: 11703 ft: 12622 corp: 6/63b lim: 40 exec/s: 0 rss: 67Mb L: 12/14 MS: 1 CopyPart- 00:07:39.282 [2024-04-25 23:54:28.728085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.282 [2024-04-25 23:54:28.728112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.282 #11 NEW cov: 11703 ft: 12715 corp: 7/75b lim: 40 exec/s: 0 rss: 67Mb L: 12/14 MS: 1 CopyPart- 00:07:39.282 [2024-04-25 23:54:28.768209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:39.282 [2024-04-25 23:54:28.768236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.282 #12 NEW cov: 11703 ft: 12781 corp: 8/87b lim: 40 exec/s: 0 rss: 67Mb L: 12/14 MS: 1 ChangeBit- 00:07:39.282 [2024-04-25 23:54:28.808288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.282 [2024-04-25 23:54:28.808314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.282 #13 NEW cov: 11703 ft: 12846 corp: 9/99b lim: 40 exec/s: 0 rss: 67Mb L: 12/14 MS: 1 ShuffleBytes- 00:07:39.283 [2024-04-25 23:54:28.848495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.283 [2024-04-25 23:54:28.848521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.283 #14 NEW cov: 11703 ft: 12873 corp: 10/111b lim: 40 exec/s: 0 rss: 68Mb L: 12/14 MS: 1 CrossOver- 00:07:39.283 [2024-04-25 23:54:28.888470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2c0a9595 cdw11:95951595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.283 [2024-04-25 23:54:28.888498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.542 #18 NEW cov: 11703 ft: 12891 corp: 11/119b lim: 40 exec/s: 0 rss: 69Mb L: 8/14 MS: 4 EraseBytes-ChangeBit-CopyPart-InsertByte- 00:07:39.542 [2024-04-25 23:54:28.928625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2c95950a cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.542 [2024-04-25 23:54:28.928650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.542 #21 NEW cov: 11703 ft: 12904 corp: 12/129b lim: 40 exec/s: 0 rss: 69Mb L: 10/14 MS: 3 EraseBytes-CrossOver-CrossOver- 00:07:39.542 [2024-04-25 23:54:28.968756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a969595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.542 [2024-04-25 23:54:28.968782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.542 #22 NEW cov: 11703 ft: 12938 corp: 13/141b lim: 40 exec/s: 0 rss: 69Mb L: 12/14 MS: 1 ChangeBinInt- 00:07:39.542 [2024-04-25 23:54:29.008864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.542 [2024-04-25 23:54:29.008889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.542 #23 NEW cov: 11703 ft: 13010 corp: 14/153b lim: 40 exec/s: 0 rss: 69Mb L: 12/14 MS: 1 ShuffleBytes- 00:07:39.542 [2024-04-25 23:54:29.048952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.542 [2024-04-25 23:54:29.048978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.542 #24 NEW cov: 11703 ft: 13133 corp: 15/165b lim: 40 exec/s: 0 rss: 69Mb L: 12/14 MS: 1 ShuffleBytes- 00:07:39.542 [2024-04-25 23:54:29.089064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.542 [2024-04-25 23:54:29.089089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.542 #25 NEW cov: 11703 ft: 13152 corp: 16/177b lim: 40 exec/s: 0 rss: 69Mb L: 12/14 MS: 1 ChangeBinInt- 00:07:39.542 [2024-04-25 23:54:29.119299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.542 [2024-04-25 23:54:29.119324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.542 [2024-04-25 23:54:29.119380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:95959595 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.542 [2024-04-25 23:54:29.119399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.542 #26 NEW cov: 11703 ft: 13852 corp: 17/193b lim: 40 exec/s: 0 rss: 69Mb L: 16/16 MS: 1 InsertRepeatedBytes- 00:07:39.802 [2024-04-25 23:54:29.159258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:956b6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.802 [2024-04-25 23:54:29.159284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.802 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:39.802 #27 NEW cov: 11726 ft: 13982 corp: 18/205b lim: 40 exec/s: 0 rss: 69Mb L: 12/16 MS: 1 ChangeBinInt- 00:07:39.802 [2024-04-25 23:54:29.199413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:6a6a7395 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.802 [2024-04-25 23:54:29.199441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.802 #28 NEW cov: 11726 ft: 14025 corp: 19/215b lim: 40 exec/s: 0 rss: 69Mb L: 10/16 MS: 1 EraseBytes- 00:07:39.802 [2024-04-25 23:54:29.239522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:9595950c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.802 [2024-04-25 23:54:29.239546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.802 #29 NEW cov: 11726 ft: 14040 corp: 20/227b lim: 40 exec/s: 29 rss: 69Mb L: 12/16 MS: 1 ChangeBinInt- 00:07:39.802 [2024-04-25 23:54:29.269629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.802 [2024-04-25 23:54:29.269654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.802 #30 NEW cov: 11726 ft: 14050 corp: 21/236b lim: 40 exec/s: 30 rss: 70Mb L: 9/16 
MS: 1 EraseBytes- 00:07:39.802 [2024-04-25 23:54:29.299665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95950a95 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.802 [2024-04-25 23:54:29.299690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.802 #31 NEW cov: 11726 ft: 14063 corp: 22/249b lim: 40 exec/s: 31 rss: 70Mb L: 13/16 MS: 1 CrossOver- 00:07:39.802 [2024-04-25 23:54:29.339779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:958f9595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.802 [2024-04-25 23:54:29.339804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.802 #32 NEW cov: 11726 ft: 14098 corp: 23/261b lim: 40 exec/s: 32 rss: 70Mb L: 12/16 MS: 1 ChangeBinInt- 00:07:39.802 [2024-04-25 23:54:29.369865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff0e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.802 [2024-04-25 23:54:29.369889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.802 #35 NEW cov: 11726 ft: 14109 corp: 24/275b lim: 40 exec/s: 35 rss: 70Mb L: 14/16 MS: 3 EraseBytes-CMP-CMP- DE: "\021\000\000\000"-"\377\377\377\377\377\377\377\016"- 00:07:39.802 [2024-04-25 23:54:29.409997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:23ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.802 [2024-04-25 23:54:29.410024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.062 #36 NEW cov: 11726 ft: 14140 corp: 25/290b lim: 40 exec/s: 36 rss: 70Mb L: 15/16 MS: 1 InsertByte- 00:07:40.062 [2024-04-25 23:54:29.450106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2c951100 cdw11:0000950a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.062 [2024-04-25 23:54:29.450131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.062 #37 NEW cov: 11726 ft: 14146 corp: 26/304b lim: 40 exec/s: 37 rss: 70Mb L: 14/16 MS: 1 PersAutoDict- DE: "\021\000\000\000"- 00:07:40.062 [2024-04-25 23:54:29.490539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.062 [2024-04-25 23:54:29.490564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.062 [2024-04-25 23:54:29.490621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.062 [2024-04-25 23:54:29.490637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.062 [2024-04-25 23:54:29.490693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00009595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.062 [2024-04-25 23:54:29.490707] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.062 #38 NEW cov: 11726 ft: 14463 corp: 27/333b lim: 40 exec/s: 38 rss: 70Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:40.062 [2024-04-25 23:54:29.530333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a95950a cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.062 [2024-04-25 23:54:29.530359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.062 #39 NEW cov: 11726 ft: 14476 corp: 28/342b lim: 40 exec/s: 39 rss: 70Mb L: 9/29 MS: 1 CrossOver- 00:07:40.062 [2024-04-25 23:54:29.570469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2c951100 cdw11:0008950a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.062 [2024-04-25 23:54:29.570494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.062 #40 NEW cov: 11726 ft: 14479 corp: 29/356b lim: 40 exec/s: 40 rss: 70Mb L: 14/29 MS: 1 ChangeBit- 00:07:40.062 [2024-04-25 23:54:29.610739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.062 [2024-04-25 23:54:29.610764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.062 [2024-04-25 23:54:29.610821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:958f9595 cdw11:95959795 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.062 [2024-04-25 23:54:29.610835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.062 #41 NEW cov: 11726 ft: 14498 corp: 30/379b lim: 40 exec/s: 41 rss: 70Mb L: 23/29 MS: 1 CopyPart- 00:07:40.062 [2024-04-25 23:54:29.650658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:686f6f6f cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.062 [2024-04-25 23:54:29.650682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.321 #42 NEW cov: 11726 ft: 14502 corp: 31/393b lim: 40 exec/s: 42 rss: 70Mb L: 14/29 MS: 1 ShuffleBytes- 00:07:40.321 [2024-04-25 23:54:29.690803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959508 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.321 [2024-04-25 23:54:29.690828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.321 #43 NEW cov: 11726 ft: 14513 corp: 32/402b lim: 40 exec/s: 43 rss: 70Mb L: 9/29 MS: 1 ChangeBit- 00:07:40.321 [2024-04-25 23:54:29.730909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffff41 cdw11:23ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.321 [2024-04-25 23:54:29.730934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.321 #44 NEW cov: 11726 ft: 14531 corp: 33/417b lim: 40 exec/s: 44 rss: 70Mb L: 15/29 MS: 1 ChangeByte- 00:07:40.321 [2024-04-25 23:54:29.771048] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2c95950a cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.321 [2024-04-25 23:54:29.771073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.321 #45 NEW cov: 11726 ft: 14559 corp: 34/427b lim: 40 exec/s: 45 rss: 70Mb L: 10/29 MS: 1 ChangeByte- 00:07:40.321 [2024-04-25 23:54:29.801114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a969595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.321 [2024-04-25 23:54:29.801142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.321 #46 NEW cov: 11726 ft: 14565 corp: 35/439b lim: 40 exec/s: 46 rss: 70Mb L: 12/29 MS: 1 ChangeBit- 00:07:40.321 [2024-04-25 23:54:29.841250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a950c00 cdw11:00009595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.321 [2024-04-25 23:54:29.841276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.321 #47 NEW cov: 11726 ft: 14571 corp: 36/451b lim: 40 exec/s: 47 rss: 70Mb L: 12/29 MS: 1 ChangeBinInt- 00:07:40.321 [2024-04-25 23:54:29.881355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:95959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.321 [2024-04-25 23:54:29.881380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.321 #48 NEW cov: 11726 ft: 14574 corp: 37/463b lim: 40 exec/s: 48 rss: 70Mb L: 12/29 MS: 1 CopyPart- 00:07:40.321 [2024-04-25 23:54:29.911425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.321 [2024-04-25 23:54:29.911451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.580 #49 NEW cov: 11726 ft: 14583 corp: 38/472b lim: 40 exec/s: 49 rss: 70Mb L: 9/29 MS: 1 ChangeByte- 00:07:40.580 [2024-04-25 23:54:29.951736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.580 [2024-04-25 23:54:29.951760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.580 [2024-04-25 23:54:29.951818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:77155dda cdw11:a88e1695 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.580 [2024-04-25 23:54:29.951832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.580 [2024-04-25 23:54:29.991845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.580 [2024-04-25 23:54:29.991870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.580 [2024-04-25 23:54:29.991928] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a959595 cdw11:958e1695 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.580 [2024-04-25 23:54:29.991942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.580 #51 NEW cov: 11726 ft: 14609 corp: 39/492b lim: 40 exec/s: 51 rss: 70Mb L: 20/29 MS: 2 CMP-CopyPart- DE: "\000w\025]\332\250\216\026"- 00:07:40.580 [2024-04-25 23:54:30.031797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959508 cdw11:952c9595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.580 [2024-04-25 23:54:30.031822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.580 #52 NEW cov: 11726 ft: 14618 corp: 40/501b lim: 40 exec/s: 52 rss: 71Mb L: 9/29 MS: 1 ChangeByte- 00:07:40.580 [2024-04-25 23:54:30.071962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:956b6a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.580 [2024-04-25 23:54:30.071988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.580 #53 NEW cov: 11726 ft: 14665 corp: 41/514b lim: 40 exec/s: 53 rss: 71Mb L: 13/29 MS: 1 InsertByte- 00:07:40.580 [2024-04-25 23:54:30.112083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:95959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.580 [2024-04-25 23:54:30.112108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.580 #54 NEW cov: 11726 ft: 14676 corp: 42/526b lim: 40 exec/s: 54 rss: 71Mb L: 12/29 MS: 1 ChangeBit- 00:07:40.580 [2024-04-25 23:54:30.152194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2c95ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.580 [2024-04-25 23:54:30.152219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.580 #55 NEW cov: 11726 ft: 14699 corp: 43/536b lim: 40 exec/s: 55 rss: 71Mb L: 10/29 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\016"- 00:07:40.871 [2024-04-25 23:54:30.192247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a959595 cdw11:15959595 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.871 [2024-04-25 23:54:30.192273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.871 #56 NEW cov: 11726 ft: 14700 corp: 44/548b lim: 40 exec/s: 56 rss: 71Mb L: 12/29 MS: 1 ChangeBit- 00:07:40.871 [2024-04-25 23:54:30.232412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:11000000 cdw11:6f6f6f6f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.871 [2024-04-25 23:54:30.232437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.871 #57 NEW cov: 11726 ft: 14704 corp: 45/562b lim: 40 exec/s: 28 rss: 71Mb L: 14/29 MS: 1 PersAutoDict- DE: "\021\000\000\000"- 00:07:40.871 #57 DONE cov: 11726 ft: 14704 corp: 45/562b lim: 40 exec/s: 28 rss: 71Mb 00:07:40.871 ###### Recommended 
dictionary. ###### 00:07:40.871 "\021\000\000\000" # Uses: 2 00:07:40.871 "\377\377\377\377\377\377\377\016" # Uses: 1 00:07:40.871 "\000w\025]\332\250\216\026" # Uses: 0 00:07:40.871 ###### End of recommended dictionary. ###### 00:07:40.871 Done 57 runs in 2 second(s) 00:07:40.871 23:54:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:07:40.871 23:54:30 -- ../common.sh@72 -- # (( i++ )) 00:07:40.871 23:54:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:40.871 23:54:30 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:40.871 23:54:30 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:40.871 23:54:30 -- nvmf/run.sh@24 -- # local timen=1 00:07:40.871 23:54:30 -- nvmf/run.sh@25 -- # local core=0x1 00:07:40.871 23:54:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:40.871 23:54:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:40.871 23:54:30 -- nvmf/run.sh@29 -- # printf %02d 13 00:07:40.871 23:54:30 -- nvmf/run.sh@29 -- # port=4413 00:07:40.871 23:54:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:40.871 23:54:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:40.871 23:54:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:40.871 23:54:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:07:40.871 [2024-04-25 23:54:30.413503] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:40.871 [2024-04-25 23:54:30.413576] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477351 ] 00:07:40.871 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.131 [2024-04-25 23:54:30.593929] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.131 [2024-04-25 23:54:30.613320] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:41.131 [2024-04-25 23:54:30.613455] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.131 [2024-04-25 23:54:30.665132] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:41.131 [2024-04-25 23:54:30.681440] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:41.131 INFO: Running with entropic power schedule (0xFF, 100). 
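[editor's note] The nvmf/run.sh trace in the preceding records shows the per-round launch pattern: the round index is formatted with printf and combined into a TCP port (12 -> 4412, 13 -> 4413), the corpus directory is created, the trsvcid in the JSON target config is rewritten with sed, and llvm_nvme_fuzz is started against the resulting transport ID. A minimal bash sketch of that pattern, reconstructed only from what the trace prints: the SPDK_DIR variable is a stand-in for the workspace spdk path, the redirection of sed's output into the per-round config is not visible in the trace and is assumed, and the flag meanings (-t run time in seconds, -D corpus directory, -Z round index, -r RPC socket) are inferences from the surrounding log, not verified against the SPDK sources.

#!/bin/bash
# Sketch of one fuzzing round as implied by the nvmf/run.sh trace.
# Paths, redirections, and flag semantics are assumptions inferred
# from the log output above, not a verified copy of the script.
fuzzer_type=13                              # round index (-Z in the trace)
timen=1                                     # assumed: run time in seconds (-t)
core=0x1                                    # reactor core mask (-m)
port="44$(printf %02d "$fuzzer_type")"      # 13 -> 4413, matching the trace
corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}"
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"

mkdir -p "$corpus_dir"
# Point this round's target config at its own port; the trace shows the
# sed expression but not where its output goes, so the redirect is assumed.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
"$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
    -m "$core" -s 512 -P "$SPDK_DIR/../output/llvm/" \
    -F "$trid" -c "$nvmf_cfg" -t "$timen" \
    -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"

The per-round config and socket names keep concurrent rounds isolated, which is consistent with the "rm -rf /tmp/fuzz_json_12.conf" cleanup and the spdk12.sock/spdk13.sock names seen in the log.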
00:07:41.131 INFO: Seed: 3419645701 00:07:41.131 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:41.131 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:41.131 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:41.131 INFO: A corpus is not provided, starting from an empty corpus 00:07:41.131 #2 INITED exec/s: 0 rss: 60Mb 00:07:41.131 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:41.131 This may also happen if the target rejected all inputs we tried so far 00:07:41.391 [2024-04-25 23:54:30.747899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.391 [2024-04-25 23:54:30.747932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.391 [2024-04-25 23:54:30.748071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.391 [2024-04-25 23:54:30.748090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.650 NEW_FUNC[1/661]: 0x4afa60 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:41.650 NEW_FUNC[2/661]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:41.650 #4 NEW cov: 11485 ft: 11483 corp: 2/19b lim: 40 exec/s: 0 rss: 67Mb L: 18/18 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:41.650 [2024-04-25 23:54:31.088469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a50aa5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.650 [2024-04-25 23:54:31.088509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.650 [2024-04-25 23:54:31.088648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.650 [2024-04-25 23:54:31.088667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.650 NEW_FUNC[1/2]: 0xed42e0 in spdk_get_ticks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:296 00:07:41.650 NEW_FUNC[2/2]: 0xed4340 in rte_get_timer_cycles /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic/rte_cycles.h:94 00:07:41.650 #10 NEW cov: 11600 ft: 12144 corp: 3/38b lim: 40 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 CrossOver- 00:07:41.650 [2024-04-25 23:54:31.148569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a57d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.650 [2024-04-25 23:54:31.148595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.650 [2024-04-25 23:54:31.148728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.650 [2024-04-25 23:54:31.148746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.650 #11 NEW cov: 11606 ft: 12394 corp: 4/56b lim: 40 exec/s: 0 rss: 67Mb L: 18/19 MS: 1 ChangeByte- 00:07:41.650 [2024-04-25 23:54:31.188965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a50aa5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.650 [2024-04-25 23:54:31.188994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.650 [2024-04-25 23:54:31.189138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a50a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.650 [2024-04-25 23:54:31.189157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.650 [2024-04-25 23:54:31.189287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.650 [2024-04-25 23:54:31.189303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.650 #12 NEW cov: 11691 ft: 12912 corp: 5/81b lim: 40 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CopyPart- 00:07:41.650 [2024-04-25 23:54:31.249005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a57d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.650 [2024-04-25 23:54:31.249033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.650 [2024-04-25 23:54:31.249159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5ada5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.650 [2024-04-25 23:54:31.249175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.910 #13 NEW cov: 11691 ft: 13103 corp: 6/99b lim: 40 exec/s: 0 rss: 68Mb L: 18/25 MS: 1 ChangeBit- 00:07:41.910 [2024-04-25 23:54:31.288736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a50aa5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.910 [2024-04-25 23:54:31.288762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.910 [2024-04-25 23:54:31.288885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a50a cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.910 [2024-04-25 23:54:31.288903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.910 #14 NEW cov: 11691 ft: 13195 corp: 7/120b lim: 40 exec/s: 0 rss: 68Mb L: 21/25 MS: 1 CrossOver- 00:07:41.910 [2024-04-25 23:54:31.339154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.910 [2024-04-25 23:54:31.339181] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.910 [2024-04-25 23:54:31.339324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d7d7d7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.910 [2024-04-25 23:54:31.339341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.910 #17 NEW cov: 11691 ft: 13253 corp: 8/140b lim: 40 exec/s: 0 rss: 68Mb L: 20/25 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:41.910 [2024-04-25 23:54:31.378832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.910 [2024-04-25 23:54:31.378858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.910 [2024-04-25 23:54:31.378976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d7d7d7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.910 [2024-04-25 23:54:31.378994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.910 #18 NEW cov: 11691 ft: 13332 corp: 9/156b lim: 40 exec/s: 0 rss: 68Mb L: 16/25 MS: 1 EraseBytes- 00:07:41.910 [2024-04-25 23:54:31.419205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.910 [2024-04-25 23:54:31.419232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.910 #19 NEW cov: 11691 ft: 13708 corp: 10/170b lim: 40 exec/s: 0 rss: 68Mb L: 14/25 MS: 1 EraseBytes- 00:07:41.910 [2024-04-25 23:54:31.469296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aff7615 cdw11:5eba0534 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.910 [2024-04-25 23:54:31.469324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.910 #20 NEW cov: 11691 ft: 13771 corp: 11/179b lim: 40 exec/s: 0 rss: 68Mb L: 9/25 MS: 1 CMP- DE: "\377v\025^\272\0054\336"- 00:07:41.910 [2024-04-25 23:54:31.519783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a57d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.910 [2024-04-25 23:54:31.519810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.910 [2024-04-25 23:54:31.519933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5ada5 cdw11:a52aa5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.910 [2024-04-25 23:54:31.519951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.169 #21 NEW cov: 11691 ft: 13813 corp: 12/197b lim: 40 exec/s: 0 rss: 68Mb L: 18/25 MS: 1 ChangeByte- 00:07:42.169 [2024-04-25 23:54:31.570224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aff7615 cdw11:5e000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:42.169 [2024-04-25 23:54:31.570252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.169 [2024-04-25 23:54:31.570399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.169 [2024-04-25 23:54:31.570416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.169 [2024-04-25 23:54:31.570559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.169 [2024-04-25 23:54:31.570577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.169 #22 NEW cov: 11691 ft: 13847 corp: 13/228b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:07:42.169 [2024-04-25 23:54:31.630127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad7f7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.169 [2024-04-25 23:54:31.630154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.170 [2024-04-25 23:54:31.630303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d7d7d7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.170 [2024-04-25 23:54:31.630319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.170 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:42.170 #23 NEW cov: 11714 ft: 13933 corp: 14/244b lim: 40 exec/s: 0 rss: 68Mb L: 16/31 MS: 1 ChangeBit- 00:07:42.170 [2024-04-25 23:54:31.690298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a50aa5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.170 [2024-04-25 23:54:31.690323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.170 [2024-04-25 23:54:31.690458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a50a cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.170 [2024-04-25 23:54:31.690474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.170 #24 NEW cov: 11714 ft: 13972 corp: 15/265b lim: 40 exec/s: 0 rss: 69Mb L: 21/31 MS: 1 ShuffleBytes- 00:07:42.170 [2024-04-25 23:54:31.740718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a5a5a50a cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.170 [2024-04-25 23:54:31.740745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.170 [2024-04-25 23:54:31.740875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a50a cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.170 [2024-04-25 23:54:31.740894] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.170 [2024-04-25 23:54:31.741026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a5a5a5a5 cdw11:a50aa5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.170 [2024-04-25 23:54:31.741041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.170 #25 NEW cov: 11714 ft: 13999 corp: 16/296b lim: 40 exec/s: 25 rss: 69Mb L: 31/31 MS: 1 CopyPart- 00:07:42.430 [2024-04-25 23:54:31.790844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0adddddd cdw11:dddddddd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.430 [2024-04-25 23:54:31.790872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.430 [2024-04-25 23:54:31.791009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.430 [2024-04-25 23:54:31.791026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.430 [2024-04-25 23:54:31.791163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.430 [2024-04-25 23:54:31.791181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.430 #26 NEW cov: 11714 ft: 14013 corp: 17/323b lim: 40 exec/s: 26 rss: 69Mb L: 27/31 MS: 1 InsertRepeatedBytes- 00:07:42.430 [2024-04-25 23:54:31.840575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aff7615 cdw11:5eba0534 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.430 [2024-04-25 23:54:31.840604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.430 #27 NEW cov: 11714 ft: 14027 corp: 18/332b lim: 40 exec/s: 27 rss: 69Mb L: 9/31 MS: 1 ShuffleBytes- 00:07:42.430 [2024-04-25 23:54:31.890685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.430 [2024-04-25 23:54:31.890712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.430 #28 NEW cov: 11714 ft: 14048 corp: 19/346b lim: 40 exec/s: 28 rss: 69Mb L: 14/31 MS: 1 CrossOver- 00:07:42.430 [2024-04-25 23:54:31.940907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.430 [2024-04-25 23:54:31.940934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.430 #29 NEW cov: 11714 ft: 14063 corp: 20/360b lim: 40 exec/s: 29 rss: 69Mb L: 14/31 MS: 1 ChangeBit- 00:07:42.430 [2024-04-25 23:54:31.991041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.430 
[2024-04-25 23:54:31.991070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.430 #30 NEW cov: 11714 ft: 14082 corp: 21/371b lim: 40 exec/s: 30 rss: 69Mb L: 11/31 MS: 1 EraseBytes- 00:07:42.690 [2024-04-25 23:54:32.041414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a5a5a5a5 cdw11:a5a50aa5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.690 [2024-04-25 23:54:32.041442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.690 [2024-04-25 23:54:32.041579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a560 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.690 [2024-04-25 23:54:32.041600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.690 #31 NEW cov: 11714 ft: 14103 corp: 22/387b lim: 40 exec/s: 31 rss: 69Mb L: 16/31 MS: 1 EraseBytes- 00:07:42.690 [2024-04-25 23:54:32.091866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a50aa5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.690 [2024-04-25 23:54:32.091893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.690 [2024-04-25 23:54:32.092040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a50a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.690 [2024-04-25 23:54:32.092057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.690 [2024-04-25 23:54:32.092195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a5a50aa5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.690 [2024-04-25 23:54:32.092215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.690 #32 NEW cov: 11714 ft: 14132 corp: 23/415b lim: 40 exec/s: 32 rss: 69Mb L: 28/31 MS: 1 CopyPart- 00:07:42.690 [2024-04-25 23:54:32.141828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a57d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.690 [2024-04-25 23:54:32.141856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.690 [2024-04-25 23:54:32.141981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5adada5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.690 [2024-04-25 23:54:32.141999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.690 #33 NEW cov: 11714 ft: 14151 corp: 24/433b lim: 40 exec/s: 33 rss: 69Mb L: 18/31 MS: 1 ChangeBit- 00:07:42.690 [2024-04-25 23:54:32.191953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:d74d4d4d cdw11:4d4d4d4d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.690 [2024-04-25 23:54:32.191981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.690 [2024-04-25 23:54:32.192109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:4d4d4d4d cdw11:4d4d4d4d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.690 [2024-04-25 23:54:32.192130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.690 #38 NEW cov: 11714 ft: 14155 corp: 25/454b lim: 40 exec/s: 38 rss: 69Mb L: 21/31 MS: 5 CrossOver-CrossOver-CopyPart-CrossOver-InsertRepeatedBytes- 00:07:42.690 [2024-04-25 23:54:32.241816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3b0a0ad7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.690 [2024-04-25 23:54:32.241846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.690 #39 NEW cov: 11714 ft: 14183 corp: 26/466b lim: 40 exec/s: 39 rss: 69Mb L: 12/31 MS: 1 InsertByte- 00:07:42.950 [2024-04-25 23:54:32.302095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.950 [2024-04-25 23:54:32.302123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.950 #40 NEW cov: 11714 ft: 14188 corp: 27/480b lim: 40 exec/s: 40 rss: 69Mb L: 14/31 MS: 1 CopyPart- 00:07:42.950 [2024-04-25 23:54:32.352465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0adddddd cdw11:dddddddd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.950 [2024-04-25 23:54:32.352490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.950 [2024-04-25 23:54:32.352626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ddddff76 cdw11:155eba05 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.950 [2024-04-25 23:54:32.352642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.950 #41 NEW cov: 11714 ft: 14202 corp: 28/498b lim: 40 exec/s: 41 rss: 69Mb L: 18/31 MS: 1 CrossOver- 00:07:42.950 [2024-04-25 23:54:32.392607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a50aa5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.950 [2024-04-25 23:54:32.392633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.950 [2024-04-25 23:54:32.392754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a593 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.950 [2024-04-25 23:54:32.392787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.950 #42 NEW cov: 11714 ft: 14209 corp: 29/517b lim: 40 exec/s: 42 rss: 69Mb L: 19/31 MS: 1 ChangeByte- 00:07:42.950 [2024-04-25 23:54:32.442626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a50aa5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.950 [2024-04-25 
23:54:32.442652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.950 [2024-04-25 23:54:32.442776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a50a cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.950 [2024-04-25 23:54:32.442793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.950 #43 NEW cov: 11714 ft: 14222 corp: 30/535b lim: 40 exec/s: 43 rss: 69Mb L: 18/31 MS: 1 EraseBytes- 00:07:42.950 [2024-04-25 23:54:32.492923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.950 [2024-04-25 23:54:32.492949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.950 [2024-04-25 23:54:32.493083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a57da5 cdw11:a5ada5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.950 [2024-04-25 23:54:32.493098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.950 #44 NEW cov: 11714 ft: 14288 corp: 31/553b lim: 40 exec/s: 44 rss: 69Mb L: 18/31 MS: 1 CopyPart- 00:07:42.950 [2024-04-25 23:54:32.542752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.950 [2024-04-25 23:54:32.542779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.210 #45 NEW cov: 11714 ft: 14293 corp: 32/564b lim: 40 exec/s: 45 rss: 69Mb L: 11/31 MS: 1 ShuffleBytes- 00:07:43.210 [2024-04-25 23:54:32.582938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aff7615 cdw11:5eba0532 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.210 [2024-04-25 23:54:32.582965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.210 #46 NEW cov: 11714 ft: 14315 corp: 33/573b lim: 40 exec/s: 46 rss: 69Mb L: 9/31 MS: 1 ChangeASCIIInt- 00:07:43.210 [2024-04-25 23:54:32.633075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0ad7d7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.210 [2024-04-25 23:54:32.633100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.210 #47 NEW cov: 11714 ft: 14333 corp: 34/584b lim: 40 exec/s: 47 rss: 69Mb L: 11/31 MS: 1 CopyPart- 00:07:43.210 [2024-04-25 23:54:32.683398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0a0c0c cdw11:0c0c0c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.210 [2024-04-25 23:54:32.683439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.210 [2024-04-25 23:54:32.683583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0c0c0cd7 cdw11:d7d7d7d7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:43.210 [2024-04-25 23:54:32.683599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.210 #48 NEW cov: 11714 ft: 14349 corp: 35/607b lim: 40 exec/s: 48 rss: 69Mb L: 23/31 MS: 1 InsertRepeatedBytes- 00:07:43.210 [2024-04-25 23:54:32.733684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a50ae5a5 cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.210 [2024-04-25 23:54:32.733711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.210 [2024-04-25 23:54:32.733857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a5a5a50a cdw11:a5a5a5a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.210 [2024-04-25 23:54:32.733873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.210 #49 NEW cov: 11714 ft: 14409 corp: 36/628b lim: 40 exec/s: 24 rss: 69Mb L: 21/31 MS: 1 ChangeBit- 00:07:43.210 #49 DONE cov: 11714 ft: 14409 corp: 36/628b lim: 40 exec/s: 24 rss: 69Mb 00:07:43.210 ###### Recommended dictionary. ###### 00:07:43.210 "\377v\025^\272\0054\336" # Uses: 0 00:07:43.210 ###### End of recommended dictionary. ###### 00:07:43.210 Done 49 runs in 2 second(s) 00:07:43.469 23:54:32 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:07:43.469 23:54:32 -- ../common.sh@72 -- # (( i++ )) 00:07:43.469 23:54:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:43.469 23:54:32 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:07:43.469 23:54:32 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:07:43.469 23:54:32 -- nvmf/run.sh@24 -- # local timen=1 00:07:43.469 23:54:32 -- nvmf/run.sh@25 -- # local core=0x1 00:07:43.469 23:54:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:43.469 23:54:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:07:43.469 23:54:32 -- nvmf/run.sh@29 -- # printf %02d 14 00:07:43.469 23:54:32 -- nvmf/run.sh@29 -- # port=4414 00:07:43.469 23:54:32 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:43.469 23:54:32 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:07:43.469 23:54:32 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:43.469 23:54:32 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:07:43.469 [2024-04-25 23:54:32.908892] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:43.469 [2024-04-25 23:54:32.908983] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477812 ] 00:07:43.469 EAL: No free 2048 kB hugepages reported on node 1 00:07:43.729 [2024-04-25 23:54:33.086845] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.729 [2024-04-25 23:54:33.106045] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:43.729 [2024-04-25 23:54:33.106169] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.729 [2024-04-25 23:54:33.157523] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:43.729 [2024-04-25 23:54:33.173775] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:07:43.729 INFO: Running with entropic power schedule (0xFF, 100). 00:07:43.729 INFO: Seed: 1616685415 00:07:43.729 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:43.729 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:43.729 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:43.729 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.729 #2 INITED exec/s: 0 rss: 59Mb 00:07:43.729 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:43.729 This may also happen if the target rejected all inputs we tried so far 00:07:43.729 [2024-04-25 23:54:33.219187] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.729 [2024-04-25 23:54:33.219216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.988 NEW_FUNC[1/666]: 0x4b1620 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:43.988 NEW_FUNC[2/666]: 0x4d29c0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:43.988 #4 NEW cov: 11514 ft: 11515 corp: 2/21b lim: 35 exec/s: 0 rss: 67Mb L: 20/20 MS: 2 CMP-InsertRepeatedBytes- DE: "\000\000\000\000"- 00:07:43.988 [2024-04-25 23:54:33.529838] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.988 [2024-04-25 23:54:33.529873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.988 #6 NEW cov: 11634 ft: 12660 corp: 3/28b lim: 35 exec/s: 0 rss: 67Mb L: 7/20 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:43.988 [2024-04-25 23:54:33.570426] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.988 [2024-04-25 23:54:33.570453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.988 [2024-04-25 23:54:33.570520] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.988 [2024-04-25 23:54:33.570535] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.988 [2024-04-25 23:54:33.570598] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.988 [2024-04-25 23:54:33.570612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.988 #7 NEW cov: 11640 ft: 13242 corp: 4/62b lim: 35 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 CopyPart- 00:07:44.247 [2024-04-25 23:54:33.610554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.610580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.247 [2024-04-25 23:54:33.610640] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.610655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.247 [2024-04-25 23:54:33.610715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.610729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.247 [2024-04-25 23:54:33.610791] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.610805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.247 #8 NEW cov: 11725 ft: 13575 corp: 5/92b lim: 35 exec/s: 0 rss: 67Mb L: 30/34 MS: 1 InsertRepeatedBytes- 00:07:44.247 [2024-04-25 23:54:33.660244] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.660271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.247 #9 NEW cov: 11725 ft: 13839 corp: 6/99b lim: 35 exec/s: 0 rss: 67Mb L: 7/34 MS: 1 ChangeByte- 00:07:44.247 [2024-04-25 23:54:33.700738] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.700763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.247 #10 NEW cov: 11725 ft: 14063 corp: 7/120b lim: 35 exec/s: 0 rss: 67Mb L: 21/34 MS: 1 CrossOver- 00:07:44.247 [2024-04-25 23:54:33.740560] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.740585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.247 [2024-04-25 23:54:33.740649] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.740663] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.247 #19 NEW cov: 11725 ft: 14250 corp: 8/136b lim: 35 exec/s: 0 rss: 67Mb L: 16/34 MS: 4 CopyPart-ChangeByte-InsertByte-InsertRepeatedBytes- 00:07:44.247 [2024-04-25 23:54:33.780699] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.780725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.247 [2024-04-25 23:54:33.780788] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.780802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.247 #20 NEW cov: 11725 ft: 14292 corp: 9/152b lim: 35 exec/s: 0 rss: 68Mb L: 16/34 MS: 1 ChangeByte- 00:07:44.247 [2024-04-25 23:54:33.820896] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.820921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.247 [2024-04-25 23:54:33.820984] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.247 [2024-04-25 23:54:33.820999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.247 #21 NEW cov: 11725 ft: 14315 corp: 10/168b lim: 35 exec/s: 0 rss: 68Mb L: 16/34 MS: 1 ShuffleBytes- 00:07:44.507 [2024-04-25 23:54:33.860976] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:33.861001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.507 #22 NEW cov: 11725 ft: 14357 corp: 11/188b lim: 35 exec/s: 0 rss: 68Mb L: 20/34 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:44.507 [2024-04-25 23:54:33.900943] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:33.900970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.507 #23 NEW cov: 11725 ft: 14397 corp: 12/196b lim: 35 exec/s: 0 rss: 68Mb L: 8/34 MS: 1 InsertByte- 00:07:44.507 [2024-04-25 23:54:33.941211] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:33.941236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.507 [2024-04-25 23:54:33.941296] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:33.941310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.507 #28 NEW cov: 11725 ft: 14504 corp: 
13/211b lim: 35 exec/s: 0 rss: 68Mb L: 15/34 MS: 5 EraseBytes-ChangeBit-CopyPart-ChangeByte-InsertRepeatedBytes- 00:07:44.507 [2024-04-25 23:54:33.981709] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:33.981734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.507 [2024-04-25 23:54:33.981798] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:33.981813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.507 [2024-04-25 23:54:33.981873] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:33.981887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.507 #29 NEW cov: 11725 ft: 14612 corp: 14/245b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 CrossOver- 00:07:44.507 [2024-04-25 23:54:34.021513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:34.021540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.507 #30 NEW cov: 11725 ft: 14639 corp: 15/264b lim: 35 exec/s: 0 rss: 68Mb L: 19/34 MS: 1 EraseBytes- 00:07:44.507 [2024-04-25 23:54:34.061581] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:34.061608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.507 [2024-04-25 23:54:34.061668] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:34.061683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.507 #34 NEW cov: 11725 ft: 14680 corp: 16/283b lim: 35 exec/s: 0 rss: 68Mb L: 19/34 MS: 4 EraseBytes-EraseBytes-ShuffleBytes-CrossOver- 00:07:44.507 [2024-04-25 23:54:34.101495] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000050 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.507 [2024-04-25 23:54:34.101520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.766 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:44.766 #37 NEW cov: 11748 ft: 14763 corp: 17/292b lim: 35 exec/s: 0 rss: 68Mb L: 9/34 MS: 3 InsertRepeatedBytes-EraseBytes-CrossOver- 00:07:44.766 [2024-04-25 23:54:34.142313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.142339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.766 [2024-04-25 
23:54:34.142404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.142419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.766 [2024-04-25 23:54:34.142482] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.142496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.766 [2024-04-25 23:54:34.142558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000030 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.142571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.766 #38 NEW cov: 11748 ft: 14842 corp: 18/327b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertByte- 00:07:44.766 [2024-04-25 23:54:34.181899] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.181923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.766 [2024-04-25 23:54:34.181986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.182001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.766 #39 NEW cov: 11748 ft: 14859 corp: 19/343b lim: 35 exec/s: 0 rss: 68Mb L: 16/35 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:44.766 [2024-04-25 23:54:34.222037] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.222061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.766 [2024-04-25 23:54:34.222123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.222140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.766 #45 NEW cov: 11748 ft: 14870 corp: 20/358b lim: 35 exec/s: 45 rss: 68Mb L: 15/35 MS: 1 ChangeByte- 00:07:44.766 [2024-04-25 23:54:34.262512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.262538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.766 [2024-04-25 23:54:34.262599] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.262613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.766 [2024-04-25 23:54:34.262671] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES 
RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.262685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.766 [2024-04-25 23:54:34.262746] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.262761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.766 #46 NEW cov: 11748 ft: 14900 corp: 21/388b lim: 35 exec/s: 46 rss: 69Mb L: 30/35 MS: 1 ChangeByte- 00:07:44.766 [2024-04-25 23:54:34.302670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.302696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.766 [2024-04-25 23:54:34.302757] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.302771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.766 [2024-04-25 23:54:34.302833] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.766 [2024-04-25 23:54:34.302847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.767 #47 NEW cov: 11748 ft: 14954 corp: 22/422b lim: 35 exec/s: 47 rss: 69Mb L: 34/35 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:44.767 [2024-04-25 23:54:34.342216] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000007c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.767 [2024-04-25 23:54:34.342243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.767 #48 NEW cov: 11748 ft: 14968 corp: 23/430b lim: 35 exec/s: 48 rss: 69Mb L: 8/35 MS: 1 ShuffleBytes- 00:07:45.026 [2024-04-25 23:54:34.383055] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.383082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.026 [2024-04-25 23:54:34.383144] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.383158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.026 [2024-04-25 23:54:34.383219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.383236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.026 [2024-04-25 23:54:34.383295] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:000000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:45.026 [2024-04-25 23:54:34.383310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.026 #49 NEW cov: 11748 ft: 15010 corp: 24/465b lim: 35 exec/s: 49 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:07:45.026 [2024-04-25 23:54:34.422660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.422686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.026 #50 NEW cov: 11748 ft: 15016 corp: 25/484b lim: 35 exec/s: 50 rss: 69Mb L: 19/35 MS: 1 ChangeBit- 00:07:45.026 [2024-04-25 23:54:34.462585] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.462612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.026 #51 NEW cov: 11748 ft: 15032 corp: 26/491b lim: 35 exec/s: 51 rss: 69Mb L: 7/35 MS: 1 ShuffleBytes- 00:07:45.026 [2024-04-25 23:54:34.503457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.503486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.026 [2024-04-25 23:54:34.503549] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.503564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.026 [2024-04-25 23:54:34.503625] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.503638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.026 [2024-04-25 23:54:34.503700] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:000000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.503714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.026 #52 NEW cov: 11748 ft: 15054 corp: 27/526b lim: 35 exec/s: 52 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:45.026 [2024-04-25 23:54:34.543164] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.543192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.026 [2024-04-25 23:54:34.543256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.543273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.026 [2024-04-25 23:54:34.543331] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 
cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.543347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.026 #53 NEW cov: 11748 ft: 15091 corp: 28/549b lim: 35 exec/s: 53 rss: 69Mb L: 23/35 MS: 1 InsertRepeatedBytes- 00:07:45.026 [2024-04-25 23:54:34.583075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.583104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.026 [2024-04-25 23:54:34.583165] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.583179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.026 #54 NEW cov: 11748 ft: 15105 corp: 29/563b lim: 35 exec/s: 54 rss: 69Mb L: 14/35 MS: 1 EraseBytes- 00:07:45.026 [2024-04-25 23:54:34.623265] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.026 [2024-04-25 23:54:34.623290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.285 #55 NEW cov: 11748 ft: 15107 corp: 30/582b lim: 35 exec/s: 55 rss: 69Mb L: 19/35 MS: 1 ChangeByte- 00:07:45.285 [2024-04-25 23:54:34.663693] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.285 [2024-04-25 23:54:34.663719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.285 [2024-04-25 23:54:34.663780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.285 [2024-04-25 23:54:34.663794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.285 [2024-04-25 23:54:34.663856] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.285 [2024-04-25 23:54:34.663870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.286 #56 NEW cov: 11748 ft: 15130 corp: 31/616b lim: 35 exec/s: 56 rss: 69Mb L: 34/35 MS: 1 ChangeBinInt- 00:07:45.286 [2024-04-25 23:54:34.703670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.703695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.286 [2024-04-25 23:54:34.703759] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.703773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.286 #57 NEW cov: 11748 ft: 15145 corp: 32/640b lim: 35 exec/s: 57 rss: 69Mb L: 24/35 MS: 1 
EraseBytes- 00:07:45.286 [2024-04-25 23:54:34.743866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.743891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.286 [2024-04-25 23:54:34.743953] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.743967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.286 [2024-04-25 23:54:34.744027] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.744041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.286 #58 NEW cov: 11748 ft: 15168 corp: 33/668b lim: 35 exec/s: 58 rss: 69Mb L: 28/35 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:45.286 [2024-04-25 23:54:34.783957] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.783984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.286 [2024-04-25 23:54:34.784048] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.784066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.286 [2024-04-25 23:54:34.784127] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.784143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.286 [2024-04-25 23:54:34.784204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.784221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.286 #59 NEW cov: 11748 ft: 15190 corp: 34/700b lim: 35 exec/s: 59 rss: 69Mb L: 32/35 MS: 1 CopyPart- 00:07:45.286 [2024-04-25 23:54:34.823813] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.823841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.286 #60 NEW cov: 11748 ft: 15200 corp: 35/719b lim: 35 exec/s: 60 rss: 70Mb L: 19/35 MS: 1 EraseBytes- 00:07:45.286 [2024-04-25 23:54:34.863703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.863731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:45.286 #61 NEW cov: 11748 ft: 15223 corp: 36/730b lim: 35 exec/s: 61 rss: 70Mb L: 11/35 MS: 1 CMP- DE: "\011\000\000\000"- 00:07:45.286 [2024-04-25 23:54:34.894213] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.286 [2024-04-25 23:54:34.894239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.545 #62 NEW cov: 11748 ft: 15231 corp: 37/751b lim: 35 exec/s: 62 rss: 70Mb L: 21/35 MS: 1 ChangeByte- 00:07:45.545 [2024-04-25 23:54:34.934512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:34.934538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.545 [2024-04-25 23:54:34.934597] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:34.934612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.545 [2024-04-25 23:54:34.934674] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:34.934689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.545 #63 NEW cov: 11748 ft: 15246 corp: 38/779b lim: 35 exec/s: 63 rss: 70Mb L: 28/35 MS: 1 EraseBytes- 00:07:45.545 [2024-04-25 23:54:34.974225] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:34.974253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.545 #64 NEW cov: 11748 ft: 15283 corp: 39/798b lim: 35 exec/s: 64 rss: 70Mb L: 19/35 MS: 1 ChangeBinInt- 00:07:45.545 [2024-04-25 23:54:35.014832] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:35.014861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.545 [2024-04-25 23:54:35.014930] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:35.014944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.545 [2024-04-25 23:54:35.015008] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:35.015023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.545 [2024-04-25 23:54:35.015084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:000000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:35.015099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.545 #65 NEW cov: 11748 ft: 15327 corp: 40/833b lim: 35 exec/s: 65 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:45.545 [2024-04-25 23:54:35.055035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:35.055063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.545 [2024-04-25 23:54:35.055128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:35.055143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.545 [2024-04-25 23:54:35.055205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:35.055219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.545 [2024-04-25 23:54:35.055282] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:35.055297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.545 #66 NEW cov: 11748 ft: 15339 corp: 41/868b lim: 35 exec/s: 66 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:45.545 [2024-04-25 23:54:35.094433] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:35.094462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.545 #67 NEW cov: 11748 ft: 15365 corp: 42/875b lim: 35 exec/s: 67 rss: 70Mb L: 7/35 MS: 1 ShuffleBytes- 00:07:45.545 [2024-04-25 23:54:35.134506] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000d0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.545 [2024-04-25 23:54:35.134532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.545 #68 NEW cov: 11748 ft: 15366 corp: 43/886b lim: 35 exec/s: 68 rss: 70Mb L: 11/35 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:45.805 [2024-04-25 23:54:35.175181] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.805 [2024-04-25 23:54:35.175206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.805 [2024-04-25 23:54:35.175264] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.805 [2024-04-25 23:54:35.175278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.805 [2024-04-25 23:54:35.175343] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.805 [2024-04-25 23:54:35.175358] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.805 #69 NEW cov: 11748 ft: 15427 corp: 44/914b lim: 35 exec/s: 69 rss: 70Mb L: 28/35 MS: 1 ChangeByte- 00:07:45.805 [2024-04-25 23:54:35.214930] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.805 [2024-04-25 23:54:35.214955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.805 [2024-04-25 23:54:35.215015] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.805 [2024-04-25 23:54:35.215029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.805 #70 NEW cov: 11748 ft: 15434 corp: 45/930b lim: 35 exec/s: 35 rss: 70Mb L: 16/35 MS: 1 ChangeByte- 00:07:45.805 #70 DONE cov: 11748 ft: 15434 corp: 45/930b lim: 35 exec/s: 35 rss: 70Mb 00:07:45.805 ###### Recommended dictionary. ###### 00:07:45.805 "\000\000\000\000" # Uses: 4 00:07:45.805 "\000\000\000\000\000\000\000\000" # Uses: 1 00:07:45.805 "\011\000\000\000" # Uses: 0 00:07:45.805 ###### End of recommended dictionary. ###### 00:07:45.805 Done 70 runs in 2 second(s) 00:07:45.805 23:54:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:07:45.805 23:54:35 -- ../common.sh@72 -- # (( i++ )) 00:07:45.805 23:54:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.805 23:54:35 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:45.805 23:54:35 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:45.805 23:54:35 -- nvmf/run.sh@24 -- # local timen=1 00:07:45.805 23:54:35 -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.805 23:54:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:45.805 23:54:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:45.805 23:54:35 -- nvmf/run.sh@29 -- # printf %02d 15 00:07:45.805 23:54:35 -- nvmf/run.sh@29 -- # port=4415 00:07:45.805 23:54:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:45.805 23:54:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:45.805 23:54:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.805 23:54:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:07:45.805 [2024-04-25 23:54:35.396779] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:45.805 [2024-04-25 23:54:35.396873] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478347 ] 00:07:46.064 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.064 [2024-04-25 23:54:35.574555] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.064 [2024-04-25 23:54:35.593362] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:46.064 [2024-04-25 23:54:35.593488] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.064 [2024-04-25 23:54:35.644945] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:46.064 [2024-04-25 23:54:35.661245] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:46.064 INFO: Running with entropic power schedule (0xFF, 100). 00:07:46.064 INFO: Seed: 4101686231 00:07:46.322 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:46.322 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:46.322 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:46.322 INFO: A corpus is not provided, starting from an empty corpus 00:07:46.322 #2 INITED exec/s: 0 rss: 60Mb 00:07:46.322 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:46.322 This may also happen if the target rejected all inputs we tried so far 00:07:46.322 [2024-04-25 23:54:35.709720] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.322 [2024-04-25 23:54:35.709750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.581 NEW_FUNC[1/663]: 0x4b2b60 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:46.581 NEW_FUNC[2/663]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:46.581 #6 NEW cov: 11469 ft: 11469 corp: 2/11b lim: 35 exec/s: 0 rss: 67Mb L: 10/10 MS: 4 ShuffleBytes-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:07:46.581 NEW_FUNC[1/1]: 0x4d2e90 in feat_async_event_cfg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:346 00:07:46.581 #9 NEW cov: 11686 ft: 12121 corp: 3/24b lim: 35 exec/s: 0 rss: 67Mb L: 13/13 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 00:07:46.582 [2024-04-25 23:54:36.060537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.582 [2024-04-25 23:54:36.060571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.582 #10 NEW cov: 11692 ft: 12296 corp: 4/35b lim: 35 exec/s: 0 rss: 67Mb L: 11/13 MS: 1 InsertByte- 00:07:46.582 [2024-04-25 23:54:36.100671] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.582 [2024-04-25 23:54:36.100699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 
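For readers skimming these runs, the status lines follow libFuzzer's usual format: in "#6 NEW cov: 11469 ft: 11469 corp: 2/11b lim: 35 exec/s: 0 rss: 67Mb L: 10/10 MS: 4 ...", cov counts covered code edges, ft counts coverage features, corp gives the corpus size as entries/total bytes, lim is the current cap on input length, L shows the new input's length against the largest in the corpus, and MS lists the mutation sequence that produced it. A trailing DE: names the dictionary entry a mutation consumed, which is also what the "Recommended dictionary" summaries at the end of each run are tallying in their "# Uses:" counts.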
sqhd:000f p:0 m:0 dnr:0 00:07:46.582 #11 NEW cov: 11777 ft: 12517 corp: 5/46b lim: 35 exec/s: 0 rss: 67Mb L: 11/13 MS: 1 ChangeBit- 00:07:46.582 [2024-04-25 23:54:36.140804] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.582 [2024-04-25 23:54:36.140829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.582 #12 NEW cov: 11777 ft: 12551 corp: 6/57b lim: 35 exec/s: 0 rss: 67Mb L: 11/13 MS: 1 CMP- DE: "\377\377\377\377"- 00:07:46.582 [2024-04-25 23:54:36.170941] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.582 [2024-04-25 23:54:36.170966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.582 [2024-04-25 23:54:36.171023] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.582 [2024-04-25 23:54:36.171038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.841 #13 NEW cov: 11777 ft: 12941 corp: 7/76b lim: 35 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CMP- DE: "\001\004\000\000\000\000\000\000"- 00:07:46.841 [2024-04-25 23:54:36.210947] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.841 [2024-04-25 23:54:36.210973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.841 #14 NEW cov: 11777 ft: 12985 corp: 8/86b lim: 35 exec/s: 0 rss: 68Mb L: 10/19 MS: 1 ChangeBit- 00:07:46.841 [2024-04-25 23:54:36.251020] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.841 [2024-04-25 23:54:36.251048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.841 #15 NEW cov: 11777 ft: 13030 corp: 9/96b lim: 35 exec/s: 0 rss: 68Mb L: 10/19 MS: 1 ChangeBinInt- 00:07:46.841 [2024-04-25 23:54:36.291224] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.841 [2024-04-25 23:54:36.291249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.841 #16 NEW cov: 11777 ft: 13084 corp: 10/107b lim: 35 exec/s: 0 rss: 68Mb L: 11/19 MS: 1 ChangeBinInt- 00:07:46.841 [2024-04-25 23:54:36.331269] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.841 [2024-04-25 23:54:36.331296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.841 #17 NEW cov: 11777 ft: 13138 corp: 11/118b lim: 35 exec/s: 0 rss: 68Mb L: 11/19 MS: 1 ChangeByte- 00:07:46.841 #18 NEW cov: 11777 ft: 13229 corp: 12/131b lim: 35 exec/s: 0 rss: 68Mb L: 13/19 MS: 1 ChangeByte- 00:07:46.841 #19 NEW cov: 11777 ft: 13276 corp: 13/144b lim: 35 exec/s: 0 rss: 68Mb L: 13/19 MS: 1 ChangeByte- 00:07:46.841 [2024-04-25 23:54:36.451889] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.841 [2024-04-25 23:54:36.451916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.841 [2024-04-25 23:54:36.451976] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.841 [2024-04-25 23:54:36.451990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.100 [2024-04-25 23:54:36.452049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000001b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.100 [2024-04-25 23:54:36.452062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.100 #20 NEW cov: 11777 ft: 13546 corp: 14/166b lim: 35 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 CrossOver- 00:07:47.100 [2024-04-25 23:54:36.491696] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.100 [2024-04-25 23:54:36.491723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.100 #21 NEW cov: 11777 ft: 13574 corp: 15/177b lim: 35 exec/s: 0 rss: 68Mb L: 11/22 MS: 1 ChangeByte- 00:07:47.100 #22 NEW cov: 11777 ft: 13634 corp: 16/190b lim: 35 exec/s: 0 rss: 68Mb L: 13/22 MS: 1 ShuffleBytes- 00:07:47.100 #23 NEW cov: 11777 ft: 13653 corp: 17/203b lim: 35 exec/s: 0 rss: 68Mb L: 13/22 MS: 1 ChangeByte- 00:07:47.100 [2024-04-25 23:54:36.592017] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.100 [2024-04-25 23:54:36.592043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.100 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:47.100 #24 NEW cov: 11800 ft: 13683 corp: 18/214b lim: 35 exec/s: 0 rss: 68Mb L: 11/22 MS: 1 ChangeByte- 00:07:47.100 #25 NEW cov: 11800 ft: 13710 corp: 19/227b lim: 35 exec/s: 0 rss: 68Mb L: 13/22 MS: 1 ChangeBit- 00:07:47.100 [2024-04-25 23:54:36.672259] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.100 [2024-04-25 23:54:36.672285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.100 #26 NEW cov: 11800 ft: 13715 corp: 20/237b lim: 35 exec/s: 0 rss: 68Mb L: 10/22 MS: 1 CMP- DE: "\001\037"- 00:07:47.359 #27 NEW cov: 11800 ft: 13732 corp: 21/250b lim: 35 exec/s: 27 rss: 69Mb L: 13/22 MS: 1 CopyPart- 00:07:47.359 [2024-04-25 23:54:36.742821] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.742848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.359 #28 NEW cov: 11800 ft: 13767 corp: 22/267b lim: 35 exec/s: 28 rss: 69Mb L: 17/22 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:47.359 
[2024-04-25 23:54:36.782929] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.782955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.359 [2024-04-25 23:54:36.783011] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.783026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.359 [2024-04-25 23:54:36.783082] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.783095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.359 [2024-04-25 23:54:36.783135] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.783150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.359 #30 NEW cov: 11800 ft: 14203 corp: 23/299b lim: 35 exec/s: 30 rss: 69Mb L: 32/32 MS: 2 PersAutoDict-InsertRepeatedBytes- DE: "\001\037"- 00:07:47.359 [2024-04-25 23:54:36.823010] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.823036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.359 [2024-04-25 23:54:36.823093] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.823108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.359 [2024-04-25 23:54:36.823162] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.823176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.359 [2024-04-25 23:54:36.823232] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.823245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.359 #31 NEW cov: 11800 ft: 14214 corp: 24/331b lim: 35 exec/s: 31 rss: 69Mb L: 32/32 MS: 1 CrossOver- 00:07:47.359 #32 NEW cov: 11800 ft: 14338 corp: 25/344b lim: 35 exec/s: 32 rss: 69Mb L: 13/32 MS: 1 ChangeBit- 00:07:47.359 [2024-04-25 23:54:36.902909] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.902934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.359 #33 NEW cov: 11800 ft: 14428 corp: 26/355b lim: 35 exec/s: 
33 rss: 69Mb L: 11/32 MS: 1 ChangeByte- 00:07:47.359 [2024-04-25 23:54:36.943381] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.943414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.359 [2024-04-25 23:54:36.943472] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.359 [2024-04-25 23:54:36.943486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.618 #34 NEW cov: 11800 ft: 14452 corp: 27/376b lim: 35 exec/s: 34 rss: 69Mb L: 21/32 MS: 1 CrossOver- 00:07:47.618 #35 NEW cov: 11800 ft: 14471 corp: 28/389b lim: 35 exec/s: 35 rss: 69Mb L: 13/32 MS: 1 CMP- DE: "\001\000\000\000\002\012)\265"- 00:07:47.618 [2024-04-25 23:54:37.013526] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.618 [2024-04-25 23:54:37.013551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.618 [2024-04-25 23:54:37.013610] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.618 [2024-04-25 23:54:37.013624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.618 NEW_FUNC[1/1]: 0x4cbe90 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:07:47.618 #36 NEW cov: 11838 ft: 14524 corp: 29/410b lim: 35 exec/s: 36 rss: 69Mb L: 21/32 MS: 1 PersAutoDict- DE: "\001\037"- 00:07:47.618 #37 NEW cov: 11838 ft: 14601 corp: 30/423b lim: 35 exec/s: 37 rss: 69Mb L: 13/32 MS: 1 CopyPart- 00:07:47.618 [2024-04-25 23:54:37.083449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.618 [2024-04-25 23:54:37.083475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.618 #38 NEW cov: 11838 ft: 14634 corp: 31/434b lim: 35 exec/s: 38 rss: 69Mb L: 11/32 MS: 1 ChangeBinInt- 00:07:47.618 [2024-04-25 23:54:37.123537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.618 [2024-04-25 23:54:37.123563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.618 #44 NEW cov: 11838 ft: 14651 corp: 32/445b lim: 35 exec/s: 44 rss: 69Mb L: 11/32 MS: 1 ChangeBit- 00:07:47.618 #45 NEW cov: 11838 ft: 14667 corp: 33/458b lim: 35 exec/s: 45 rss: 69Mb L: 13/32 MS: 1 ChangeBinInt- 00:07:47.619 [2024-04-25 23:54:37.193800] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.619 [2024-04-25 23:54:37.193826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.619 #46 NEW cov: 11838 ft: 14668 corp: 34/469b lim: 35 exec/s: 46 rss: 69Mb L: 11/32 
MS: 1 ChangeByte- 00:07:47.878 [2024-04-25 23:54:37.233927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.878 [2024-04-25 23:54:37.233951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.878 #47 NEW cov: 11838 ft: 14682 corp: 35/480b lim: 35 exec/s: 47 rss: 69Mb L: 11/32 MS: 1 ChangeASCIIInt- 00:07:47.878 [2024-04-25 23:54:37.274227] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.878 [2024-04-25 23:54:37.274251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.878 [2024-04-25 23:54:37.274310] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.878 [2024-04-25 23:54:37.274324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.878 [2024-04-25 23:54:37.274382] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.878 [2024-04-25 23:54:37.274399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.878 #48 NEW cov: 11838 ft: 14717 corp: 36/506b lim: 35 exec/s: 48 rss: 69Mb L: 26/32 MS: 1 InsertRepeatedBytes- 00:07:47.878 [2024-04-25 23:54:37.314124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.878 [2024-04-25 23:54:37.314149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.878 #49 NEW cov: 11838 ft: 14722 corp: 37/517b lim: 35 exec/s: 49 rss: 70Mb L: 11/32 MS: 1 ShuffleBytes- 00:07:47.878 [2024-04-25 23:54:37.354205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.878 [2024-04-25 23:54:37.354230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.878 #50 NEW cov: 11838 ft: 14727 corp: 38/527b lim: 35 exec/s: 50 rss: 70Mb L: 10/32 MS: 1 ChangeByte- 00:07:47.878 [2024-04-25 23:54:37.394469] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.878 [2024-04-25 23:54:37.394495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.878 [2024-04-25 23:54:37.394552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.878 [2024-04-25 23:54:37.394566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.878 #51 NEW cov: 11838 ft: 14733 corp: 39/544b lim: 35 exec/s: 51 rss: 70Mb L: 17/32 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:47.878 [2024-04-25 23:54:37.434663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d6 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:47.878 [2024-04-25 23:54:37.434688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.878 #52 NEW cov: 11838 ft: 14747 corp: 40/561b lim: 35 exec/s: 52 rss: 70Mb L: 17/32 MS: 1 ChangeByte- 00:07:47.878 [2024-04-25 23:54:37.474748] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.878 [2024-04-25 23:54:37.474773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.137 [2024-04-25 23:54:37.514851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.137 [2024-04-25 23:54:37.514876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.137 #54 NEW cov: 11838 ft: 14758 corp: 41/578b lim: 35 exec/s: 54 rss: 70Mb L: 17/32 MS: 2 PersAutoDict-ChangeByte- DE: "\001\000\000\000\002\012)\265"- 00:07:48.137 [2024-04-25 23:54:37.555193] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000526 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.137 [2024-04-25 23:54:37.555219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.137 [2024-04-25 23:54:37.555275] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.137 [2024-04-25 23:54:37.555289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.137 [2024-04-25 23:54:37.555347] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.137 [2024-04-25 23:54:37.555363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.137 [2024-04-25 23:54:37.555419] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000001b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.137 [2024-04-25 23:54:37.555433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.137 #55 NEW cov: 11838 ft: 14772 corp: 42/607b lim: 35 exec/s: 55 rss: 70Mb L: 29/32 MS: 1 CopyPart- 00:07:48.137 [2024-04-25 23:54:37.594938] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.137 [2024-04-25 23:54:37.594963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.137 #56 NEW cov: 11838 ft: 14816 corp: 43/618b lim: 35 exec/s: 56 rss: 70Mb L: 11/32 MS: 1 ChangeByte- 00:07:48.137 [2024-04-25 23:54:37.624949] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.137 [2024-04-25 23:54:37.624974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.137 #57 NEW cov: 11838 ft: 14875 corp: 44/629b lim: 35 exec/s: 57 rss: 70Mb L: 11/32 
MS: 1 ShuffleBytes- 00:07:48.137 [2024-04-25 23:54:37.655032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.137 [2024-04-25 23:54:37.655057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.137 #58 NEW cov: 11838 ft: 14942 corp: 45/640b lim: 35 exec/s: 58 rss: 70Mb L: 11/32 MS: 1 ChangeBinInt- 00:07:48.137 #59 NEW cov: 11838 ft: 15074 corp: 46/657b lim: 35 exec/s: 29 rss: 70Mb L: 17/32 MS: 1 PersAutoDict- DE: "\001\000\000\000\002\012)\265"- 00:07:48.137 #59 DONE cov: 11838 ft: 15074 corp: 46/657b lim: 35 exec/s: 29 rss: 70Mb 00:07:48.137 ###### Recommended dictionary. ###### 00:07:48.137 "\377\377\377\377" # Uses: 2 00:07:48.137 "\001\004\000\000\000\000\000\000" # Uses: 0 00:07:48.137 "\001\037" # Uses: 2 00:07:48.138 "\001\000\000\000\002\012)\265" # Uses: 2 00:07:48.138 ###### End of recommended dictionary. ###### 00:07:48.138 Done 59 runs in 2 second(s) 00:07:48.397 23:54:37 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:07:48.397 23:54:37 -- ../common.sh@72 -- # (( i++ )) 00:07:48.397 23:54:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:48.397 23:54:37 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:07:48.397 23:54:37 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:07:48.397 23:54:37 -- nvmf/run.sh@24 -- # local timen=1 00:07:48.397 23:54:37 -- nvmf/run.sh@25 -- # local core=0x1 00:07:48.397 23:54:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:48.397 23:54:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:07:48.397 23:54:37 -- nvmf/run.sh@29 -- # printf %02d 16 00:07:48.397 23:54:37 -- nvmf/run.sh@29 -- # port=4416 00:07:48.397 23:54:37 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:48.397 23:54:37 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:07:48.397 23:54:37 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:48.397 23:54:37 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:07:48.397 [2024-04-25 23:54:37.870357] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:07:48.397 [2024-04-25 23:54:37.870451] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478649 ] 00:07:48.397 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.656 [2024-04-25 23:54:38.050774] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.656 [2024-04-25 23:54:38.069876] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:48.656 [2024-04-25 23:54:38.069998] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.656 [2024-04-25 23:54:38.121545] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:48.656 [2024-04-25 23:54:38.137848] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:07:48.656 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.656 INFO: Seed: 2285706573 00:07:48.656 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:48.656 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:48.656 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:48.656 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.656 #2 INITED exec/s: 0 rss: 59Mb 00:07:48.656 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:48.656 This may also happen if the target rejected all inputs we tried so far 00:07:48.656 [2024-04-25 23:54:38.193250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.656 [2024-04-25 23:54:38.193282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.656 [2024-04-25 23:54:38.193318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.656 [2024-04-25 23:54:38.193334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.656 [2024-04-25 23:54:38.193386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.656 [2024-04-25 23:54:38.193409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.656 [2024-04-25 23:54:38.193465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.656 [2024-04-25 23:54:38.193480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:48.915 NEW_FUNC[1/664]: 0x4b4010 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:07:48.915 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.915 #27 NEW cov: 11572 ft: 11573 corp: 2/93b lim: 105 exec/s: 0 rss: 66Mb L: 92/92 MS: 5 ChangeBit-InsertByte-ChangeByte-InsertByte-InsertRepeatedBytes- 
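This run drives fuzz_nvm_read_command (see the NEW_FUNC line above), so the prints switch from admin SET/GET FEATURES to I/O READ commands with fuzzed nsid, lba and len fields. Per the NVMe spec, a READ carries its starting LBA split across CDW10 (low 32 bits) and CDW11 (high 32 bits), with the block count as a 0-based NLB field in CDW12 bits 15:00. A sketch of that packing, assuming only the spec layout (the struct and function names are illustrative, not SPDK's):

    #include <stdint.h>

    struct read_cdws { uint32_t cdw10, cdw11, cdw12; };

    /* Pack a READ's LBA range per the NVMe spec: CDW10/11 hold the
     * starting LBA (low/high words), CDW12 bits 15:00 hold NLB, a
     * 0-based count, so a one-block read stores NLB = 0. Caller
     * passes nblocks >= 1. */
    static struct read_cdws pack_read(uint64_t slba, uint32_t nblocks)
    {
        struct read_cdws c;
        c.cdw10 = (uint32_t)(slba & 0xffffffffu);
        c.cdw11 = (uint32_t)(slba >> 32);
        c.cdw12 = (nblocks - 1u) & 0xffffu;
        return c;
    }

The 0-based NLB would also explain why the len values printed below are one higher than the raw 16-bit field the fuzzer actually wrote.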
00:07:48.915 [2024-04-25 23:54:38.503879] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.915 [2024-04-25 23:54:38.503913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.915 [2024-04-25 23:54:38.503967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.915 [2024-04-25 23:54:38.503983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.915 [2024-04-25 23:54:38.504039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.915 [2024-04-25 23:54:38.504055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.174 #30 NEW cov: 11685 ft: 12494 corp: 3/171b lim: 105 exec/s: 0 rss: 67Mb L: 78/92 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:49.174 [2024-04-25 23:54:38.544194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.174 [2024-04-25 23:54:38.544223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.174 [2024-04-25 23:54:38.544260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.174 [2024-04-25 23:54:38.544275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.174 [2024-04-25 23:54:38.544326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.174 [2024-04-25 23:54:38.544340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.174 [2024-04-25 23:54:38.544399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.174 [2024-04-25 23:54:38.544414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.174 #31 NEW cov: 11691 ft: 12644 corp: 4/261b lim: 105 exec/s: 0 rss: 67Mb L: 90/92 MS: 1 CrossOver- 00:07:49.174 [2024-04-25 23:54:38.574143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.174 [2024-04-25 23:54:38.574170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.174 [2024-04-25 23:54:38.574206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.174 [2024-04-25 23:54:38.574221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.174 [2024-04-25 23:54:38.574271] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.174 [2024-04-25 23:54:38.574286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.174 #32 NEW cov: 11776 ft: 13026 corp: 5/343b lim: 105 exec/s: 0 rss: 67Mb L: 82/92 MS: 1 EraseBytes- 00:07:49.174 [2024-04-25 23:54:38.614429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.174 [2024-04-25 23:54:38.614457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.614503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.614518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.614568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.614582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.614632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7161677113597100031 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.614646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.175 #33 NEW cov: 11776 ft: 13127 corp: 6/430b lim: 105 exec/s: 0 rss: 67Mb L: 87/92 MS: 1 InsertRepeatedBytes- 00:07:49.175 [2024-04-25 23:54:38.654413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.654447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.654516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.654531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.654586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.654601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.175 #34 NEW cov: 11776 ft: 13242 corp: 7/508b lim: 105 exec/s: 0 rss: 67Mb L: 78/92 MS: 1 ChangeBinInt- 00:07:49.175 [2024-04-25 23:54:38.694738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.694765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.694810] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.694825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.694873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.694887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.694937] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.694950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.175 #35 NEW cov: 11776 ft: 13360 corp: 8/600b lim: 105 exec/s: 0 rss: 67Mb L: 92/92 MS: 1 ShuffleBytes- 00:07:49.175 [2024-04-25 23:54:38.734712] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.734739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.734778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.734795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.734848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.734863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.175 #36 NEW cov: 11776 ft: 13381 corp: 9/678b lim: 105 exec/s: 0 rss: 67Mb L: 78/92 MS: 1 ChangeBinInt- 00:07:49.175 [2024-04-25 23:54:38.774894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.774921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.774960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.774975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.175 [2024-04-25 23:54:38.775025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.775040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
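The huge LBAs in these entries are not noise: 7161677110969590627 is 0x6363636363636363, the byte 0x63 ('c') repeated across the whole 64-bit field, which lines up with the InsertRepeatedBytes mutations in the MS: annotations; likewise len:25444 is 0x6363 plus one. A quick arithmetic check in C:

    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        uint64_t lba = 7161677110969590627ULL;
        printf("%" PRIx64 "\n", lba);  /* prints 6363636363636363 */
        printf("%x\n", 25444u - 1u);   /* prints 6363 */
        return 0;
    }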
00:07:49.175 [2024-04-25 23:54:38.775091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.175 [2024-04-25 23:54:38.775105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.435 #37 NEW cov: 11776 ft: 13430 corp: 10/768b lim: 105 exec/s: 0 rss: 68Mb L: 90/92 MS: 1 ChangeBinInt- 00:07:49.435 [2024-04-25 23:54:38.815023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.815056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.815096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.815111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.815160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.815175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.815226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7161677113597100031 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.815240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.435 #43 NEW cov: 11776 ft: 13519 corp: 11/855b lim: 105 exec/s: 0 rss: 69Mb L: 87/92 MS: 1 ChangeBinInt- 00:07:49.435 [2024-04-25 23:54:38.855136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.855163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.855209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.855223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.855275] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.855290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.855343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.855356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.435 #44 NEW cov: 11776 ft: 13591 corp: 12/947b lim: 
105 exec/s: 0 rss: 69Mb L: 92/92 MS: 1 CopyPart- 00:07:49.435 [2024-04-25 23:54:38.895328] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.895358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.895406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.895420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.895472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.895487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.895538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.895552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.435 #45 NEW cov: 11776 ft: 13719 corp: 13/1046b lim: 105 exec/s: 0 rss: 69Mb L: 99/99 MS: 1 CrossOver- 00:07:49.435 [2024-04-25 23:54:38.935391] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.935424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.935464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.935478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.935527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.935543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.935593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7161677113597099839 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.935607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.435 #46 NEW cov: 11776 ft: 13767 corp: 14/1133b lim: 105 exec/s: 0 rss: 69Mb L: 87/99 MS: 1 ChangeByte- 00:07:49.435 [2024-04-25 23:54:38.975522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.975551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
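All of these READs complete with INVALID NAMESPACE OR FORMAT (00/0b) for the same reason: the fuzzed commands carry nsid:0, which is never a valid namespace ID, so the target rejects them before touching any data. The dnr:1 in the completions is the NVMe status field's Do Not Retry bit, set here where the admin-queue completions earlier in this log reported dnr:0.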
00:07:49.435 [2024-04-25 23:54:38.975588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:117440512 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.975606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.975659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.975679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:38.975733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:38.975749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.435 #52 NEW cov: 11776 ft: 13774 corp: 15/1225b lim: 105 exec/s: 0 rss: 69Mb L: 92/99 MS: 1 ChangeByte- 00:07:49.435 [2024-04-25 23:54:39.015613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:39.015641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:39.015687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.435 [2024-04-25 23:54:39.015703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.435 [2024-04-25 23:54:39.015756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.436 [2024-04-25 23:54:39.015771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.436 [2024-04-25 23:54:39.015823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7161677113597100031 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.436 [2024-04-25 23:54:39.015838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.436 #53 NEW cov: 11776 ft: 13851 corp: 16/1312b lim: 105 exec/s: 0 rss: 69Mb L: 87/99 MS: 1 ShuffleBytes- 00:07:49.695 [2024-04-25 23:54:39.055692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.695 [2024-04-25 23:54:39.055720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.695 [2024-04-25 23:54:39.055762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.695 [2024-04-25 23:54:39.055776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.695 [2024-04-25 23:54:39.055827] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.695 [2024-04-25 23:54:39.055842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.695 [2024-04-25 23:54:39.055892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.695 [2024-04-25 23:54:39.055907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.695 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.695 #54 NEW cov: 11799 ft: 13860 corp: 17/1402b lim: 105 exec/s: 0 rss: 69Mb L: 90/99 MS: 1 CopyPart- 00:07:49.695 [2024-04-25 23:54:39.095846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.695 [2024-04-25 23:54:39.095874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.695 [2024-04-25 23:54:39.095921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.695 [2024-04-25 23:54:39.095936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.695 [2024-04-25 23:54:39.095990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.695 [2024-04-25 23:54:39.096003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.695 [2024-04-25 23:54:39.096057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:504403158383396615 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.695 [2024-04-25 23:54:39.096072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.695 #55 NEW cov: 11799 ft: 13884 corp: 18/1504b lim: 105 exec/s: 0 rss: 69Mb L: 102/102 MS: 1 InsertRepeatedBytes- 00:07:49.695 [2024-04-25 23:54:39.125937] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.695 [2024-04-25 23:54:39.125964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.695 [2024-04-25 23:54:39.126000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.695 [2024-04-25 23:54:39.126012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.126061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.126077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 
23:54:39.126125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:504403158383396615 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.126140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.696 #56 NEW cov: 11799 ft: 13901 corp: 19/1606b lim: 105 exec/s: 0 rss: 69Mb L: 102/102 MS: 1 ShuffleBytes- 00:07:49.696 [2024-04-25 23:54:39.166007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.166033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.166078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.166093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.166145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.166160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.166209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.166221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.696 #57 NEW cov: 11799 ft: 13924 corp: 20/1707b lim: 105 exec/s: 57 rss: 69Mb L: 101/102 MS: 1 CrossOver- 00:07:49.696 [2024-04-25 23:54:39.206138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.206165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.206210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.206225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.206280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.206295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.206347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7161677113597100031 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.206361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.696 #58 NEW cov: 11799 ft: 13937 corp: 21/1794b lim: 105 
exec/s: 58 rss: 70Mb L: 87/102 MS: 1 ShuffleBytes- 00:07:49.696 [2024-04-25 23:54:39.246078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:122318240029184 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.246105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.246151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.246166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.696 #59 NEW cov: 11799 ft: 14285 corp: 22/1847b lim: 105 exec/s: 59 rss: 70Mb L: 53/102 MS: 1 CrossOver- 00:07:49.696 [2024-04-25 23:54:39.286466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.286493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.286541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.286556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.286606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.286621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.696 [2024-04-25 23:54:39.286651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7161677113597100031 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.696 [2024-04-25 23:54:39.286666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.696 #60 NEW cov: 11799 ft: 14290 corp: 23/1939b lim: 105 exec/s: 60 rss: 70Mb L: 92/102 MS: 1 InsertRepeatedBytes- 00:07:49.955 [2024-04-25 23:54:39.326262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.955 [2024-04-25 23:54:39.326288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.955 [2024-04-25 23:54:39.326323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677113597099875 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.955 [2024-04-25 23:54:39.326338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.955 #61 NEW cov: 11799 ft: 14334 corp: 24/1988b lim: 105 exec/s: 61 rss: 70Mb L: 49/102 MS: 1 EraseBytes- 00:07:49.955 [2024-04-25 23:54:39.366310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169476096 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:49.955 [2024-04-25 23:54:39.366338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.955 #62 NEW cov: 11799 ft: 14771 corp: 25/2028b lim: 105 exec/s: 62 rss: 70Mb L: 40/102 MS: 1 CrossOver- 00:07:49.955 [2024-04-25 23:54:39.406718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.955 [2024-04-25 23:54:39.406745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.955 [2024-04-25 23:54:39.406781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.955 [2024-04-25 23:54:39.406796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.956 [2024-04-25 23:54:39.406849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.406864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.956 [2024-04-25 23:54:39.406915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.406929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.956 #63 NEW cov: 11799 ft: 14790 corp: 26/2118b lim: 105 exec/s: 63 rss: 70Mb L: 90/102 MS: 1 ChangeBit- 00:07:49.956 [2024-04-25 23:54:39.446870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.446897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.956 [2024-04-25 23:54:39.446943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25436 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.446958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.956 [2024-04-25 23:54:39.447008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.447023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.956 [2024-04-25 23:54:39.447075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7161677113597100031 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.447090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:49.956 #64 NEW cov: 11799 ft: 14798 corp: 27/2205b lim: 105 exec/s: 64 rss: 70Mb L: 87/102 MS: 1 ChangeByte- 00:07:49.956 [2024-04-25 23:54:39.486847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 
nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.486874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.956 [2024-04-25 23:54:39.486911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.486926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.956 [2024-04-25 23:54:39.486978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.486995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.956 [2024-04-25 23:54:39.526962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.526989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.956 [2024-04-25 23:54:39.527027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.527041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.956 [2024-04-25 23:54:39.527095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.956 [2024-04-25 23:54:39.527110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.956 #66 NEW cov: 11799 ft: 14817 corp: 28/2287b lim: 105 exec/s: 66 rss: 70Mb L: 82/102 MS: 2 CopyPart-CopyPart- 00:07:50.215 [2024-04-25 23:54:39.567166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.567194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.215 [2024-04-25 23:54:39.567233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.567249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.215 [2024-04-25 23:54:39.567302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.567318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.215 #67 NEW cov: 11799 ft: 14873 corp: 29/2353b lim: 105 exec/s: 67 rss: 70Mb L: 66/102 MS: 1 EraseBytes- 00:07:50.215 [2024-04-25 23:54:39.607342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.607369] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.215 [2024-04-25 23:54:39.607412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.607427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.215 [2024-04-25 23:54:39.607477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.607493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.215 [2024-04-25 23:54:39.607544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:506381179801765639 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.607560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.215 #68 NEW cov: 11799 ft: 14878 corp: 30/2457b lim: 105 exec/s: 68 rss: 70Mb L: 104/104 MS: 1 CopyPart- 00:07:50.215 [2024-04-25 23:54:39.647486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.647513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.215 [2024-04-25 23:54:39.647553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.647570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.215 [2024-04-25 23:54:39.647621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.647636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.215 [2024-04-25 23:54:39.647688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.647703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.215 #69 NEW cov: 11799 ft: 14950 corp: 31/2560b lim: 105 exec/s: 69 rss: 70Mb L: 103/104 MS: 1 CopyPart- 00:07:50.215 [2024-04-25 23:54:39.687480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.687507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.215 [2024-04-25 23:54:39.687555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.215 [2024-04-25 23:54:39.687570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:07:50.215 [2024-04-25 23:54:39.687620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.687634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.216 #70 NEW cov: 11799 ft: 14962 corp: 32/2626b lim: 105 exec/s: 70 rss: 70Mb L: 66/104 MS: 1 ChangeByte- 00:07:50.216 [2024-04-25 23:54:39.727677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.727704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.216 [2024-04-25 23:54:39.727750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.727765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.216 [2024-04-25 23:54:39.727820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.727836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.216 [2024-04-25 23:54:39.727889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.727903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.216 #71 NEW cov: 11799 ft: 14989 corp: 33/2727b lim: 105 exec/s: 71 rss: 70Mb L: 101/104 MS: 1 ChangeBit- 00:07:50.216 [2024-04-25 23:54:39.767682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.767709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.216 [2024-04-25 23:54:39.767745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.767760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.216 [2024-04-25 23:54:39.767833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744071082042211 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.767850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.216 #72 NEW cov: 11799 ft: 15001 corp: 34/2802b lim: 105 exec/s: 72 rss: 70Mb L: 75/104 MS: 1 EraseBytes- 00:07:50.216 [2024-04-25 23:54:39.808041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.808068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.216 [2024-04-25 23:54:39.808118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.808134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.216 [2024-04-25 23:54:39.808184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.808199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.216 [2024-04-25 23:54:39.808250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.808264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.216 [2024-04-25 23:54:39.808318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.216 [2024-04-25 23:54:39.808333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:50.475 #73 NEW cov: 11799 ft: 15031 corp: 35/2907b lim: 105 exec/s: 73 rss: 70Mb L: 105/105 MS: 1 CopyPart- 00:07:50.475 [2024-04-25 23:54:39.847819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.847846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.847882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.847895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.475 #74 NEW cov: 11799 ft: 15042 corp: 36/2952b lim: 105 exec/s: 74 rss: 70Mb L: 45/105 MS: 1 CrossOver- 00:07:50.475 [2024-04-25 23:54:39.888210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.888236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.888282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.888297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.888344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.888359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.888418] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.888440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.475 #75 NEW cov: 11799 ft: 15054 corp: 37/3051b lim: 105 exec/s: 75 rss: 70Mb L: 99/105 MS: 1 ChangeByte- 00:07:50.475 [2024-04-25 23:54:39.918323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.918352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.918403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.918422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.918474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.918490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.918543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7161677113597067263 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.918558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.475 #76 NEW cov: 11799 ft: 15133 corp: 38/3138b lim: 105 exec/s: 76 rss: 70Mb L: 87/105 MS: 1 ChangeBit- 00:07:50.475 [2024-04-25 23:54:39.948335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.948362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.948415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.948430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.948480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.948495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.948546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.948561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.475 #77 NEW cov: 11799 ft: 15149 corp: 39/3241b lim: 105 exec/s: 77 rss: 70Mb L: 103/105 MS: 1 ChangeByte- 
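(Annotation, not part of the captured output: the blocks above repeat the same shape for every mutation, so one worked reading may help. The glosses below are interpretive, based on the NVMe completion-entry layout and libFuzzer's status-line conventions; only the quoted fields come from the run itself.)
    READ sqid:1 cid:3 nsid:0 lba:0 len:1         an I/O READ submitted on queue pair 1, command id 3,
                                                 namespace 0, starting at LBA 0
    INVALID NAMESPACE OR FORMAT (00/0b)          completion status: status code type 0x0 (generic),
                                                 status code 0x0b (Invalid Namespace or Format)
    qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1   the matching completion entry: result dword 0,
                                                 submission queue head 5, phase tag 0, more bit 0,
                                                 do-not-retry bit set
    NEW_FUNC[1/1]: 0x19788f0 in get_rusage ...   a previously uncovered function was reached for
                                                 the first time
    #77 NEW cov: 11799 ft: 15149 corp: 39/3241b  after 77 executions a new input was kept: 11799
    lim: 105 exec/s: 77 rss: 70Mb L: 103/105     coverage points, 15149 features, 39 corpus units
    MS: 1 ChangeByte-                            (3241 bytes total), current length limit 105, new
                                                 unit of 103 bytes, produced by one ChangeByte mutation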
00:07:50.475 [2024-04-25 23:54:39.988477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.988504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.988542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.988557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.988607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.988626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.475 [2024-04-25 23:54:39.988675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:506373483220382215 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.475 [2024-04-25 23:54:39.988688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.476 #78 NEW cov: 11799 ft: 15186 corp: 40/3344b lim: 105 exec/s: 78 rss: 70Mb L: 103/105 MS: 1 InsertByte- 00:07:50.476 [2024-04-25 23:54:40.028225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169476096 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.476 [2024-04-25 23:54:40.028252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.476 #79 NEW cov: 11799 ft: 15198 corp: 41/3385b lim: 105 exec/s: 79 rss: 70Mb L: 41/105 MS: 1 InsertByte- 00:07:50.476 [2024-04-25 23:54:40.068487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.476 [2024-04-25 23:54:40.068514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.476 [2024-04-25 23:54:40.068568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677113597099875 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.476 [2024-04-25 23:54:40.068584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.735 #80 NEW cov: 11799 ft: 15231 corp: 42/3434b lim: 105 exec/s: 80 rss: 70Mb L: 49/105 MS: 1 CopyPart- 00:07:50.735 [2024-04-25 23:54:40.108872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1866406400 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.108899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.735 [2024-04-25 23:54:40.108944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.108959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.735 [2024-04-25 23:54:40.109008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.109022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.735 [2024-04-25 23:54:40.109073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.109088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.735 #81 NEW cov: 11799 ft: 15237 corp: 43/3536b lim: 105 exec/s: 81 rss: 70Mb L: 102/105 MS: 1 InsertByte- 00:07:50.735 [2024-04-25 23:54:40.139022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.139051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.735 [2024-04-25 23:54:40.139088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.139104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.735 [2024-04-25 23:54:40.139156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7161849304831976291 len:65380 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.139173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.735 [2024-04-25 23:54:40.139224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7161677110969590627 len:25444 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.139238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.735 #82 NEW cov: 11799 ft: 15245 corp: 44/3621b lim: 105 exec/s: 82 rss: 71Mb L: 85/105 MS: 1 CrossOver- 00:07:50.735 [2024-04-25 23:54:40.179177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.179204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.735 [2024-04-25 23:54:40.179256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.179272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.735 [2024-04-25 23:54:40.179325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.735 [2024-04-25 23:54:40.179340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:07:50.735 [2024-04-25 23:54:40.179399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.735 [2024-04-25 23:54:40.179421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:07:50.735 [2024-04-25 23:54:40.179493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.735 [2024-04-25 23:54:40.179509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:07:50.735 #83 NEW cov: 11799 ft: 15255 corp: 45/3726b lim: 105 exec/s: 41 rss: 71Mb L: 105/105 MS: 1 CrossOver-
00:07:50.735 #83 DONE cov: 11799 ft: 15255 corp: 45/3726b lim: 105 exec/s: 41 rss: 71Mb
00:07:50.735 Done 83 runs in 2 second(s)
00:07:50.735 23:54:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf
23:54:40 -- ../common.sh@72 -- # (( i++ ))
23:54:40 -- ../common.sh@72 -- # (( i < fuzz_num ))
23:54:40 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1
23:54:40 -- nvmf/run.sh@23 -- # local fuzzer_type=17
23:54:40 -- nvmf/run.sh@24 -- # local timen=1
23:54:40 -- nvmf/run.sh@25 -- # local core=0x1
23:54:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
23:54:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf
23:54:40 -- nvmf/run.sh@29 -- # printf %02d 17
00:07:50.994 23:54:40 -- nvmf/run.sh@29 -- # port=4417
23:54:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
23:54:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417'
23:54:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
23:54:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock
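(Annotation, not part of the captured output: the xtrace above is the complete per-fuzzer setup, so a condensed shell sketch may be easier to follow than the interleaved trace. This is a reconstruction, not the verbatim run.sh: the redirection of the sed output into $nvmf_cfg is assumed, since the trace elides it, and the "44" port prefix is inferred from the trace pairing printf %02d 17 with port=4417.)
    # sketch of what the traced start_llvm_fuzz 17 step amounts to (assumptions noted above)
    fuzzer_type=17                          # also passed to the target as -Z 17
    port="44$(printf %02d "$fuzzer_type")"  # 4417 here; the 44 prefix is inferred from the trace
    corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
    nvmf_cfg=/tmp/fuzz_json_17.conf
    mkdir -p "$corpus_dir"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # retarget the stock JSON config (which listens on 4420) at this fuzzer's port;
    # writing the result to $nvmf_cfg is assumed, the trace does not show the redirection
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf > "$nvmf_cfg"
    # -m 0x1 mirrors core=0x1, -t 1 mirrors timen=1, -D points at the (empty) seed corpus
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz \
        -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ \
        -F "$trid" -c "$nvmf_cfg" -t 1 -D "$corpus_dir" -Z "$fuzzer_type" -r /var/tmp/spdk17.sock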
00:07:50.994 [2024-04-25 23:54:40.360761] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:07:50.994 [2024-04-25 23:54:40.360856] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid479185 ]
00:07:50.994 EAL: No free 2048 kB hugepages reported on node 1
00:07:50.994 [2024-04-25 23:54:40.542786] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:50.994 [2024-04-25 23:54:40.561521] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:50.994 [2024-04-25 23:54:40.561641] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:51.253 [2024-04-25 23:54:40.612992] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:51.254 [2024-04-25 23:54:40.629275] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 ***
00:07:51.254 INFO: Running with entropic power schedule (0xFF, 100).
00:07:51.254 INFO: Seed: 481735041
00:07:51.254 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23),
00:07:51.254 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798),
00:07:51.254 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:07:51.254 INFO: A corpus is not provided, starting from an empty corpus
00:07:51.254 #2 INITED exec/s: 0 rss: 59Mb
00:07:51.254 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:51.254 This may also happen if the target rejected all inputs we tried so far
00:07:51.254 [2024-04-25 23:54:40.674858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:51.254 [2024-04-25 23:54:40.674889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:51.254 [2024-04-25 23:54:40.674926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:51.254 [2024-04-25 23:54:40.674941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:51.254 [2024-04-25 23:54:40.674994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:51.254 [2024-04-25 23:54:40.675009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:51.254 [2024-04-25 23:54:40.675062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:51.254 [2024-04-25 23:54:40.675078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:07:51.513 NEW_FUNC[1/664]: 0x4b7300 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540
00:07:51.513 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:51.513 #7 NEW cov: 11592 ft: 11593 corp: 2/119b lim: 120 exec/s: 0 rss: 67Mb L: 118/118 MS:
5 InsertByte-ChangeBit-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:51.513 [2024-04-25 23:54:40.985636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:40.985669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.513 [2024-04-25 23:54:40.985704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:40.985719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.513 [2024-04-25 23:54:40.985772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:40.985791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.513 [2024-04-25 23:54:40.985844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:40.985859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.513 NEW_FUNC[1/1]: 0xfa53b0 in _sock_flush /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1317 00:07:51.513 #8 NEW cov: 11706 ft: 12036 corp: 3/237b lim: 120 exec/s: 0 rss: 68Mb L: 118/118 MS: 1 CopyPart- 00:07:51.513 [2024-04-25 23:54:41.035388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947089112607017 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:41.035421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.513 [2024-04-25 23:54:41.035458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:41.035474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.513 #12 NEW cov: 11712 ft: 12840 corp: 4/299b lim: 120 exec/s: 0 rss: 68Mb L: 62/118 MS: 4 InsertByte-ChangeByte-ChangeBinInt-CrossOver- 00:07:51.513 [2024-04-25 23:54:41.075832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:41.075860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.513 [2024-04-25 23:54:41.075897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:41.075912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.513 [2024-04-25 23:54:41.075966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 
cid:2 nsid:0 lba:2965947086361143593 len:63018 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:41.075983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.513 [2024-04-25 23:54:41.076038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:41.076052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.513 #13 NEW cov: 11797 ft: 13125 corp: 5/418b lim: 120 exec/s: 0 rss: 68Mb L: 119/119 MS: 1 InsertByte- 00:07:51.513 [2024-04-25 23:54:41.115645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:41.115673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.513 [2024-04-25 23:54:41.115710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.513 [2024-04-25 23:54:41.115726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.772 #18 NEW cov: 11797 ft: 13257 corp: 6/488b lim: 120 exec/s: 0 rss: 68Mb L: 70/119 MS: 5 CopyPart-ChangeByte-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:07:51.772 [2024-04-25 23:54:41.155749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947089112607017 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.155780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.772 [2024-04-25 23:54:41.155834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.155849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.772 #19 NEW cov: 11797 ft: 13415 corp: 7/550b lim: 120 exec/s: 0 rss: 68Mb L: 62/119 MS: 1 ChangeByte- 00:07:51.772 [2024-04-25 23:54:41.196225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.196252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.772 [2024-04-25 23:54:41.196289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.196305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.772 [2024-04-25 23:54:41.196358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.196374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.772 [2024-04-25 23:54:41.196433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.196448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.772 #20 NEW cov: 11797 ft: 13464 corp: 8/668b lim: 120 exec/s: 0 rss: 68Mb L: 118/119 MS: 1 ChangeBit- 00:07:51.772 [2024-04-25 23:54:41.236120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.236147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.772 [2024-04-25 23:54:41.236183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.236199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.772 [2024-04-25 23:54:41.236255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2697472 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.236270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.772 #21 NEW cov: 11797 ft: 13849 corp: 9/740b lim: 120 exec/s: 0 rss: 68Mb L: 72/119 MS: 1 CrossOver- 00:07:51.772 [2024-04-25 23:54:41.276120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947089112607017 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.276149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.772 [2024-04-25 23:54:41.276195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10607 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.276211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.772 #22 NEW cov: 11797 ft: 13878 corp: 10/802b lim: 120 exec/s: 0 rss: 68Mb L: 62/119 MS: 1 ChangeByte- 00:07:51.772 [2024-04-25 23:54:41.316727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12948890935180374963 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.316754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.772 [2024-04-25 23:54:41.316803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.316819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.772 [2024-04-25 23:54:41.316873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.772 [2024-04-25 23:54:41.316888] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.772 [2024-04-25 23:54:41.316946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.773 [2024-04-25 23:54:41.316962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.773 [2024-04-25 23:54:41.317017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.773 [2024-04-25 23:54:41.317032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:51.773 #23 NEW cov: 11797 ft: 14012 corp: 11/922b lim: 120 exec/s: 0 rss: 68Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:07:51.773 [2024-04-25 23:54:41.356688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.773 [2024-04-25 23:54:41.356717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.773 [2024-04-25 23:54:41.356761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.773 [2024-04-25 23:54:41.356777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.773 [2024-04-25 23:54:41.356833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.773 [2024-04-25 23:54:41.356850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.773 [2024-04-25 23:54:41.356904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.773 [2024-04-25 23:54:41.356921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.773 #24 NEW cov: 11797 ft: 14102 corp: 12/1040b lim: 120 exec/s: 0 rss: 68Mb L: 118/120 MS: 1 ShuffleBytes- 00:07:52.032 [2024-04-25 23:54:41.396950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12948890935180374963 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.396978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.397031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.397048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.397100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:52.032 [2024-04-25 23:54:41.397120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.397174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.397190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.397244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.397259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:52.032 #25 NEW cov: 11797 ft: 14130 corp: 13/1160b lim: 120 exec/s: 0 rss: 69Mb L: 120/120 MS: 1 CopyPart- 00:07:52.032 [2024-04-25 23:54:41.436563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.436590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.436630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:3602879701896396800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.436645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.032 #26 NEW cov: 11797 ft: 14176 corp: 14/1230b lim: 120 exec/s: 0 rss: 69Mb L: 70/120 MS: 1 ChangeByte- 00:07:52.032 [2024-04-25 23:54:41.476966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.476993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.477042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.477058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.477112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:63018 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.477128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.477184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.477200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.032 #27 NEW cov: 11797 ft: 14184 corp: 15/1349b lim: 120 exec/s: 0 rss: 69Mb L: 119/120 MS: 1 CopyPart- 00:07:52.032 [2024-04-25 23:54:41.516780] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1845821440 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.516807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.516843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.516859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.032 #28 NEW cov: 11797 ft: 14200 corp: 16/1419b lim: 120 exec/s: 0 rss: 69Mb L: 70/120 MS: 1 CMP- DE: "\005\000"- 00:07:52.032 [2024-04-25 23:54:41.557346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.557373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.557443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.557459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.557514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:63018 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.557528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.557585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.557602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.557657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.557673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:52.032 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:52.032 #29 NEW cov: 11820 ft: 14238 corp: 17/1539b lim: 120 exec/s: 0 rss: 69Mb L: 120/120 MS: 1 CopyPart- 00:07:52.032 [2024-04-25 23:54:41.597325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.597353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.597407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.032 [2024-04-25 23:54:41.597424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.032 [2024-04-25 23:54:41.597478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.033 [2024-04-25 23:54:41.597495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.033 [2024-04-25 23:54:41.597551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.033 [2024-04-25 23:54:41.597567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.033 #30 NEW cov: 11820 ft: 14252 corp: 18/1657b lim: 120 exec/s: 0 rss: 69Mb L: 118/120 MS: 1 ChangeBit- 00:07:52.033 [2024-04-25 23:54:41.637285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.033 [2024-04-25 23:54:41.637312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.033 [2024-04-25 23:54:41.637350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.033 [2024-04-25 23:54:41.637367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.033 [2024-04-25 23:54:41.637427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:63018 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.033 [2024-04-25 23:54:41.637445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.366 #31 NEW cov: 11820 ft: 14281 corp: 19/1746b lim: 120 exec/s: 0 rss: 69Mb L: 89/120 MS: 1 EraseBytes- 00:07:52.366 [2024-04-25 23:54:41.677552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.677579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.677629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.677645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.677698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.677716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.677771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143596 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.677787] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.366 #32 NEW cov: 11820 ft: 14300 corp: 20/1864b lim: 120 exec/s: 32 rss: 69Mb L: 118/120 MS: 1 ChangeBinInt- 00:07:52.366 [2024-04-25 23:54:41.717206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947089112607017 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.717233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.366 #38 NEW cov: 11820 ft: 15134 corp: 21/1899b lim: 120 exec/s: 38 rss: 69Mb L: 35/120 MS: 1 CrossOver- 00:07:52.366 [2024-04-25 23:54:41.757809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070152781823 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.757836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.757884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.757900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.757957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.757973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.758031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.758047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.366 #40 NEW cov: 11820 ft: 15170 corp: 22/2011b lim: 120 exec/s: 40 rss: 69Mb L: 112/120 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:52.366 [2024-04-25 23:54:41.787877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.787910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.787944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2677716710209431849 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.787960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.788015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:63018 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.788032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.788088] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.788104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.366 #41 NEW cov: 11820 ft: 15179 corp: 23/2130b lim: 120 exec/s: 41 rss: 69Mb L: 119/120 MS: 1 ChangeByte- 00:07:52.366 [2024-04-25 23:54:41.827992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.828020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.828071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.828088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.828145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.828162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.828217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:687865856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.828233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.366 #42 NEW cov: 11820 ft: 15205 corp: 24/2240b lim: 120 exec/s: 42 rss: 69Mb L: 110/120 MS: 1 CrossOver- 00:07:52.366 [2024-04-25 23:54:41.867935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12948890935180374963 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.867961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.868000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12948738612705604531 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.868015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.868069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.868086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.366 #43 NEW cov: 11820 ft: 15208 corp: 25/2330b lim: 120 exec/s: 43 rss: 69Mb L: 90/120 MS: 1 CrossOver- 00:07:52.366 [2024-04-25 23:54:41.908106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947089112607017 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.908137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.908200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.908217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.366 [2024-04-25 23:54:41.908274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.366 [2024-04-25 23:54:41.908288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.659 #44 NEW cov: 11820 ft: 15221 corp: 26/2406b lim: 120 exec/s: 44 rss: 70Mb L: 76/120 MS: 1 CrossOver- 00:07:52.659 [2024-04-25 23:54:41.958061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:41.958088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.659 [2024-04-25 23:54:41.958140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:41.958157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.659 #45 NEW cov: 11820 ft: 15237 corp: 27/2477b lim: 120 exec/s: 45 rss: 70Mb L: 71/120 MS: 1 CrossOver- 00:07:52.659 [2024-04-25 23:54:41.998543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:41.998582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.659 [2024-04-25 23:54:41.998622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:41.998636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.659 [2024-04-25 23:54:41.998687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:41.998703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.659 [2024-04-25 23:54:41.998758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:41.998772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.659 #46 NEW cov: 11820 ft: 15243 corp: 28/2595b lim: 120 exec/s: 46 rss: 70Mb L: 118/120 MS: 1 CrossOver- 00:07:52.659 [2024-04-25 23:54:42.038571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070152781823 len:65536 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:42.038598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.659 [2024-04-25 23:54:42.038650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:42.038666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.659 [2024-04-25 23:54:42.038717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:42.038736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.659 [2024-04-25 23:54:42.038790] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:42.038807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.659 #47 NEW cov: 11820 ft: 15246 corp: 29/2707b lim: 120 exec/s: 47 rss: 70Mb L: 112/120 MS: 1 ShuffleBytes- 00:07:52.659 [2024-04-25 23:54:42.078581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:42.078609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.659 [2024-04-25 23:54:42.078652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:42.078667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.659 [2024-04-25 23:54:42.078720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.659 [2024-04-25 23:54:42.078736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.659 #48 NEW cov: 11820 ft: 15322 corp: 30/2801b lim: 120 exec/s: 48 rss: 70Mb L: 94/120 MS: 1 EraseBytes- 00:07:52.659 [2024-04-25 23:54:42.118816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12948890935180374963 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.118843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.660 [2024-04-25 23:54:42.118893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.118910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.660 [2024-04-25 23:54:42.118962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 
lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.118978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.660 [2024-04-25 23:54:42.119031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:12948890938015724467 len:46004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.119046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.660 #49 NEW cov: 11820 ft: 15328 corp: 31/2898b lim: 120 exec/s: 49 rss: 70Mb L: 97/120 MS: 1 EraseBytes- 00:07:52.660 [2024-04-25 23:54:42.158819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.158847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.660 [2024-04-25 23:54:42.158892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.158907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.660 [2024-04-25 23:54:42.158963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:63018 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.158981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.660 #50 NEW cov: 11820 ft: 15342 corp: 32/2987b lim: 120 exec/s: 50 rss: 70Mb L: 89/120 MS: 1 ChangeByte- 00:07:52.660 [2024-04-25 23:54:42.199085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070152781823 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.199112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.660 [2024-04-25 23:54:42.199163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.199179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.660 [2024-04-25 23:54:42.199232] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.199249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.660 [2024-04-25 23:54:42.199301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.199317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.660 #51 NEW cov: 11820 ft: 15345 corp: 33/3099b lim: 120 exec/s: 51 rss: 70Mb 
L: 112/120 MS: 1 ChangeBit- 00:07:52.660 [2024-04-25 23:54:42.239071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12948890935180374963 len:45876 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.239098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.660 [2024-04-25 23:54:42.239138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12948738612705604531 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.239153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.660 [2024-04-25 23:54:42.239206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.660 [2024-04-25 23:54:42.239223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.972 #52 NEW cov: 11820 ft: 15355 corp: 34/3189b lim: 120 exec/s: 52 rss: 70Mb L: 90/120 MS: 1 ChangeBit- 00:07:52.972 [2024-04-25 23:54:42.279307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070152781823 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.972 [2024-04-25 23:54:42.279334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.972 [2024-04-25 23:54:42.279385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.972 [2024-04-25 23:54:42.279403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.972 [2024-04-25 23:54:42.279457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.972 [2024-04-25 23:54:42.279473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.972 [2024-04-25 23:54:42.279525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.972 [2024-04-25 23:54:42.279541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.972 #53 NEW cov: 11820 ft: 15367 corp: 35/3301b lim: 120 exec/s: 53 rss: 70Mb L: 112/120 MS: 1 ChangeByte- 00:07:52.972 [2024-04-25 23:54:42.319444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.972 [2024-04-25 23:54:42.319472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.972 [2024-04-25 23:54:42.319521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.972 [2024-04-25 23:54:42.319538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.972 [2024-04-25 23:54:42.319590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.972 [2024-04-25 23:54:42.319607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.972 [2024-04-25 23:54:42.319661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947180850424105 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.972 [2024-04-25 23:54:42.319679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.973 #54 NEW cov: 11820 ft: 15382 corp: 36/3419b lim: 120 exec/s: 54 rss: 70Mb L: 118/120 MS: 1 ChangeByte- 00:07:52.973 [2024-04-25 23:54:42.359500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070152781823 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.359527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.359577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.359594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.359647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.359664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.359719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.359735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.973 #55 NEW cov: 11820 ft: 15393 corp: 37/3537b lim: 120 exec/s: 55 rss: 70Mb L: 118/120 MS: 1 InsertRepeatedBytes- 00:07:52.973 [2024-04-25 23:54:42.399649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.399678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.399718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.399735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.399789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:17737753864136370473 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.399804] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.399857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.399872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.973 #56 NEW cov: 11820 ft: 15406 corp: 38/3650b lim: 120 exec/s: 56 rss: 70Mb L: 113/120 MS: 1 EraseBytes- 00:07:52.973 [2024-04-25 23:54:42.439775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.439804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.439843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.439859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.439912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.439928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.439982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.439997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.973 #57 NEW cov: 11820 ft: 15408 corp: 39/3752b lim: 120 exec/s: 57 rss: 70Mb L: 102/120 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:52.973 [2024-04-25 23:54:42.479814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10535 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.479843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.479882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.479897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.479951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.479968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.480020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:52.973 [2024-04-25 23:54:42.480035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.973 #58 NEW cov: 11820 ft: 15447 corp: 40/3854b lim: 120 exec/s: 58 rss: 70Mb L: 102/120 MS: 1 ChangeBinInt- 00:07:52.973 [2024-04-25 23:54:42.519970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070152781823 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.519999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.520037] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.520052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.520105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.520121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.520175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.520191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.973 #59 NEW cov: 11820 ft: 15451 corp: 41/3967b lim: 120 exec/s: 59 rss: 70Mb L: 113/120 MS: 1 InsertByte- 00:07:52.973 [2024-04-25 23:54:42.559805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.559834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.973 [2024-04-25 23:54:42.559887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.973 [2024-04-25 23:54:42.559903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.246 #60 NEW cov: 11820 ft: 15455 corp: 42/4029b lim: 120 exec/s: 60 rss: 70Mb L: 62/120 MS: 1 CrossOver- 00:07:53.246 [2024-04-25 23:54:42.600288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070152781823 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.246 [2024-04-25 23:54:42.600317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.246 [2024-04-25 23:54:42.600356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.246 [2024-04-25 23:54:42.600372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.246 [2024-04-25 
23:54:42.600431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.246 [2024-04-25 23:54:42.600447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.246 [2024-04-25 23:54:42.600505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.246 [2024-04-25 23:54:42.600521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.246 #61 NEW cov: 11820 ft: 15461 corp: 43/4129b lim: 120 exec/s: 61 rss: 70Mb L: 100/120 MS: 1 EraseBytes- 00:07:53.246 [2024-04-25 23:54:42.630155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12948890935180374963 len:45876 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.246 [2024-04-25 23:54:42.630181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.246 [2024-04-25 23:54:42.630222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12948738612705604531 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.246 [2024-04-25 23:54:42.630240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.246 [2024-04-25 23:54:42.630293] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2965947086361143593 len:10538 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.246 [2024-04-25 23:54:42.630310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.246 #62 NEW cov: 11820 ft: 15473 corp: 44/4219b lim: 120 exec/s: 62 rss: 70Mb L: 90/120 MS: 1 CopyPart- 00:07:53.246 [2024-04-25 23:54:42.670271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.246 [2024-04-25 23:54:42.670298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.246 [2024-04-25 23:54:42.670336] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.246 [2024-04-25 23:54:42.670351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.246 [2024-04-25 23:54:42.670416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:61572653852928 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.246 [2024-04-25 23:54:42.670433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.246 #63 NEW cov: 11820 ft: 15482 corp: 45/4292b lim: 120 exec/s: 31 rss: 70Mb L: 73/120 MS: 1 InsertByte- 00:07:53.246 #63 DONE cov: 11820 ft: 15482 corp: 45/4292b lim: 120 exec/s: 31 rss: 70Mb 00:07:53.246 ###### Recommended dictionary. ###### 00:07:53.246 "\005\000" # Uses: 0 00:07:53.246 "\000\000\000\000\000\000\000\000" # Uses: 0 00:07:53.246 ###### End of recommended dictionary. 
######
00:07:53.246 Done 63 runs in 2 second(s)
00:07:53.246 23:54:42 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf
00:07:53.246 23:54:42 -- ../common.sh@72 -- # (( i++ ))
00:07:53.246 23:54:42 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:53.246 23:54:42 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1
00:07:53.246 23:54:42 -- nvmf/run.sh@23 -- # local fuzzer_type=18
00:07:53.246 23:54:42 -- nvmf/run.sh@24 -- # local timen=1
00:07:53.246 23:54:42 -- nvmf/run.sh@25 -- # local core=0x1
00:07:53.246 23:54:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:07:53.246 23:54:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf
00:07:53.246 23:54:42 -- nvmf/run.sh@29 -- # printf %02d 18
00:07:53.246 23:54:42 -- nvmf/run.sh@29 -- # port=4418
00:07:53.246 23:54:42 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:07:53.246 23:54:42 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418'
00:07:53.246 23:54:42 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:53.246 23:54:42 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock
00:07:53.503 [2024-04-25 23:54:42.844445] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:07:53.503 [2024-04-25 23:54:42.844528] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid479657 ]
00:07:53.503 EAL: No free 2048 kB hugepages reported on node 1
00:07:53.503 [2024-04-25 23:54:43.022762] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:53.503 [2024-04-25 23:54:43.042057] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:53.503 [2024-04-25 23:54:43.042179] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:53.503 [2024-04-25 23:54:43.093557] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:53.503 [2024-04-25 23:54:43.109859] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 ***
00:07:53.761 INFO: Running with entropic power schedule (0xFF, 100).
00:07:53.761 INFO: Seed: 2960748119
00:07:53.761 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23),
00:07:53.761 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798),
00:07:53.761 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:07:53.761 INFO: A corpus is not provided, starting from an empty corpus
00:07:53.761 #2 INITED exec/s: 0 rss: 59Mb
00:07:53.761 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
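The WARNING above, which continues on the next line, is normal for these runs: each fuzzer in this job starts from an empty corpus, so libFuzzer has no seed inputs when it prints its first stats line. The *NOTICE* pairs that dominate the rest of the output come from nvme_io_qpair_print_command and spdk_nvme_print_completion, and nearly every completion carries the status pair "(00/0b)": status code type 0x0 (generic command status) with status code 0x0b, Invalid Namespace or Format, since the fuzzed commands reference namespaces that do not exist (nsid:0 or random values); dnr:1 marks them do-not-retry. Below is a minimal, self-contained sketch, not SPDK code, of how that 16-bit completion status word unpacks per the NVMe specification; the helper name print_status is hypothetical.

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical standalone decoder, not an SPDK API: unpacks the 16-bit
     * NVMe completion status word. Per the spec: bit 0 = phase tag (p),
     * bits 8:1 = status code (sc), bits 11:9 = status code type (sct),
     * bit 14 = more (m), bit 15 = do not retry (dnr). */
    static void print_status(uint16_t status)
    {
        unsigned p   = status & 0x1;
        unsigned sc  = (status >> 1) & 0xff;
        unsigned sct = (status >> 9) & 0x7;
        unsigned m   = (status >> 14) & 0x1;
        unsigned dnr = (status >> 15) & 0x1;

        printf("(%02x/%02x) p:%u m:%u dnr:%u\n", sct, sc, p, m, dnr);
    }

    int main(void)
    {
        /* SCT 0x0, SC 0x0b with DNR set: prints "(00/0b) p:0 m:0 dnr:1",
         * matching the completions logged throughout this run. */
        print_status((uint16_t)((1u << 15) | (0x0b << 1)));
        return 0;
    }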
00:07:53.761 This may also happen if the target rejected all inputs we tried so far 00:07:53.761 [2024-04-25 23:54:43.158770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:53.761 [2024-04-25 23:54:43.158801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.761 [2024-04-25 23:54:43.158850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:53.761 [2024-04-25 23:54:43.158865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.020 NEW_FUNC[1/663]: 0x4bab60 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:07:54.020 NEW_FUNC[2/663]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:54.020 #4 NEW cov: 11537 ft: 11538 corp: 2/42b lim: 100 exec/s: 0 rss: 66Mb L: 41/41 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:54.020 [2024-04-25 23:54:43.469596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.020 [2024-04-25 23:54:43.469629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.020 [2024-04-25 23:54:43.469679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.020 [2024-04-25 23:54:43.469692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.020 [2024-04-25 23:54:43.469743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.020 [2024-04-25 23:54:43.469757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.020 #5 NEW cov: 11650 ft: 12379 corp: 3/104b lim: 100 exec/s: 0 rss: 67Mb L: 62/62 MS: 1 CopyPart- 00:07:54.020 [2024-04-25 23:54:43.519749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.020 [2024-04-25 23:54:43.519778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.020 [2024-04-25 23:54:43.519811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.020 [2024-04-25 23:54:43.519827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.020 [2024-04-25 23:54:43.519876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.020 [2024-04-25 23:54:43.519892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.020 [2024-04-25 23:54:43.519943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:54.020 [2024-04-25 23:54:43.519958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.020 #6 NEW cov: 11656 ft: 12824 corp: 4/186b lim: 100 exec/s: 0 
rss: 67Mb L: 82/82 MS: 1 CrossOver- 00:07:54.020 [2024-04-25 23:54:43.559641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.020 [2024-04-25 23:54:43.559667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.020 [2024-04-25 23:54:43.559716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.020 [2024-04-25 23:54:43.559730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.020 #7 NEW cov: 11741 ft: 13084 corp: 5/242b lim: 100 exec/s: 0 rss: 67Mb L: 56/82 MS: 1 EraseBytes- 00:07:54.020 [2024-04-25 23:54:43.599974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.020 [2024-04-25 23:54:43.600001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.020 [2024-04-25 23:54:43.600042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.020 [2024-04-25 23:54:43.600058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.020 [2024-04-25 23:54:43.600110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.020 [2024-04-25 23:54:43.600125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.020 [2024-04-25 23:54:43.600178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:54.020 [2024-04-25 23:54:43.600191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.020 #8 NEW cov: 11741 ft: 13214 corp: 6/323b lim: 100 exec/s: 0 rss: 67Mb L: 81/82 MS: 1 CrossOver- 00:07:54.278 [2024-04-25 23:54:43.639976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.279 [2024-04-25 23:54:43.640003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.279 [2024-04-25 23:54:43.640039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.279 [2024-04-25 23:54:43.640053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.279 [2024-04-25 23:54:43.640104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.279 [2024-04-25 23:54:43.640118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.279 #9 NEW cov: 11741 ft: 13249 corp: 7/385b lim: 100 exec/s: 0 rss: 67Mb L: 62/82 MS: 1 ChangeByte- 00:07:54.279 [2024-04-25 23:54:43.680018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.279 [2024-04-25 23:54:43.680044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.279 [2024-04-25 23:54:43.680094] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.279 [2024-04-25 23:54:43.680108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.279 #10 NEW cov: 11741 ft: 13339 corp: 8/426b lim: 100 exec/s: 0 rss: 67Mb L: 41/82 MS: 1 ChangeByte- 00:07:54.279 [2024-04-25 23:54:43.720215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.279 [2024-04-25 23:54:43.720241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.279 [2024-04-25 23:54:43.720276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.279 [2024-04-25 23:54:43.720291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.279 [2024-04-25 23:54:43.720344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.279 [2024-04-25 23:54:43.720360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.279 #11 NEW cov: 11741 ft: 13351 corp: 9/494b lim: 100 exec/s: 0 rss: 67Mb L: 68/82 MS: 1 InsertRepeatedBytes- 00:07:54.279 [2024-04-25 23:54:43.750162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.279 [2024-04-25 23:54:43.750187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.279 #12 NEW cov: 11741 ft: 13816 corp: 10/526b lim: 100 exec/s: 0 rss: 68Mb L: 32/82 MS: 1 EraseBytes- 00:07:54.279 [2024-04-25 23:54:43.790318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.279 [2024-04-25 23:54:43.790343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.279 [2024-04-25 23:54:43.790381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.279 [2024-04-25 23:54:43.790398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.279 #13 NEW cov: 11741 ft: 13903 corp: 11/583b lim: 100 exec/s: 0 rss: 68Mb L: 57/82 MS: 1 InsertByte- 00:07:54.279 [2024-04-25 23:54:43.830553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.279 [2024-04-25 23:54:43.830579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.279 [2024-04-25 23:54:43.830617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.279 [2024-04-25 23:54:43.830631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.279 [2024-04-25 23:54:43.830685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.279 [2024-04-25 23:54:43.830699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:07:54.279 #14 NEW cov: 11741 ft: 13939 corp: 12/658b lim: 100 exec/s: 0 rss: 69Mb L: 75/82 MS: 1 InsertRepeatedBytes- 00:07:54.279 [2024-04-25 23:54:43.870423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.279 [2024-04-25 23:54:43.870449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.537 #15 NEW cov: 11741 ft: 13967 corp: 13/693b lim: 100 exec/s: 0 rss: 69Mb L: 35/82 MS: 1 EraseBytes- 00:07:54.537 [2024-04-25 23:54:43.910770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.538 [2024-04-25 23:54:43.910796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:43.910831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.538 [2024-04-25 23:54:43.910845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:43.910898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.538 [2024-04-25 23:54:43.910913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.538 #17 NEW cov: 11741 ft: 13989 corp: 14/755b lim: 100 exec/s: 0 rss: 69Mb L: 62/82 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:54.538 [2024-04-25 23:54:43.950982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.538 [2024-04-25 23:54:43.951007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:43.951046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.538 [2024-04-25 23:54:43.951059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:43.951109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.538 [2024-04-25 23:54:43.951124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:43.951175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:54.538 [2024-04-25 23:54:43.951189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.538 #18 NEW cov: 11741 ft: 14061 corp: 15/836b lim: 100 exec/s: 0 rss: 69Mb L: 81/82 MS: 1 ChangeByte- 00:07:54.538 [2024-04-25 23:54:43.991014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.538 [2024-04-25 23:54:43.991040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:43.991073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.538 [2024-04-25 23:54:43.991088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:43.991139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.538 [2024-04-25 23:54:43.991153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.538 #19 NEW cov: 11741 ft: 14125 corp: 16/898b lim: 100 exec/s: 0 rss: 69Mb L: 62/82 MS: 1 ChangeBinInt- 00:07:54.538 [2024-04-25 23:54:44.021069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.538 [2024-04-25 23:54:44.021096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:44.021130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.538 [2024-04-25 23:54:44.021143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:44.021193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.538 [2024-04-25 23:54:44.021209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.538 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.538 #25 NEW cov: 11764 ft: 14176 corp: 17/960b lim: 100 exec/s: 0 rss: 69Mb L: 62/82 MS: 1 ShuffleBytes- 00:07:54.538 [2024-04-25 23:54:44.061241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.538 [2024-04-25 23:54:44.061268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:44.061302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.538 [2024-04-25 23:54:44.061315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:44.061366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.538 [2024-04-25 23:54:44.061381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.538 #26 NEW cov: 11764 ft: 14207 corp: 18/1035b lim: 100 exec/s: 0 rss: 69Mb L: 75/82 MS: 1 ShuffleBytes- 00:07:54.538 [2024-04-25 23:54:44.101145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.538 [2024-04-25 23:54:44.101169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.538 #27 NEW cov: 11764 ft: 14246 corp: 19/1065b lim: 100 exec/s: 0 rss: 69Mb L: 30/82 MS: 1 CrossOver- 00:07:54.538 [2024-04-25 23:54:44.141492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.538 [2024-04-25 23:54:44.141518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:44.141556] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.538 [2024-04-25 23:54:44.141571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.538 [2024-04-25 23:54:44.141622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.538 [2024-04-25 23:54:44.141636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.797 #28 NEW cov: 11764 ft: 14285 corp: 20/1126b lim: 100 exec/s: 28 rss: 69Mb L: 61/82 MS: 1 CMP- DE: "\377\377\377\377"- 00:07:54.797 [2024-04-25 23:54:44.181560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.797 [2024-04-25 23:54:44.181586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.797 [2024-04-25 23:54:44.181620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.797 [2024-04-25 23:54:44.181634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.797 [2024-04-25 23:54:44.181685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:54.797 [2024-04-25 23:54:44.181700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.797 #29 NEW cov: 11764 ft: 14324 corp: 21/1197b lim: 100 exec/s: 29 rss: 70Mb L: 71/82 MS: 1 InsertRepeatedBytes- 00:07:54.797 [2024-04-25 23:54:44.221589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.797 [2024-04-25 23:54:44.221615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.797 [2024-04-25 23:54:44.221651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.797 [2024-04-25 23:54:44.221665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.797 #30 NEW cov: 11764 ft: 14330 corp: 22/1244b lim: 100 exec/s: 30 rss: 70Mb L: 47/82 MS: 1 EraseBytes- 00:07:54.797 [2024-04-25 23:54:44.261700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.797 [2024-04-25 23:54:44.261727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.797 [2024-04-25 23:54:44.261763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.797 [2024-04-25 23:54:44.261779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.797 #31 NEW cov: 11764 ft: 14375 corp: 23/1285b lim: 100 exec/s: 31 rss: 70Mb L: 41/82 MS: 1 ChangeBit- 00:07:54.797 [2024-04-25 23:54:44.301806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.797 [2024-04-25 23:54:44.301833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:07:54.797 [2024-04-25 23:54:44.301884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.797 [2024-04-25 23:54:44.301904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.797 #32 NEW cov: 11764 ft: 14382 corp: 24/1327b lim: 100 exec/s: 32 rss: 70Mb L: 42/82 MS: 1 InsertByte- 00:07:54.797 [2024-04-25 23:54:44.341806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.797 [2024-04-25 23:54:44.341831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.797 #33 NEW cov: 11764 ft: 14437 corp: 25/1357b lim: 100 exec/s: 33 rss: 70Mb L: 30/82 MS: 1 ChangeByte- 00:07:54.797 [2024-04-25 23:54:44.382010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:54.797 [2024-04-25 23:54:44.382037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.797 [2024-04-25 23:54:44.382070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:54.797 [2024-04-25 23:54:44.382083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.797 #34 NEW cov: 11764 ft: 14516 corp: 26/1414b lim: 100 exec/s: 34 rss: 70Mb L: 57/82 MS: 1 ShuffleBytes- 00:07:55.056 [2024-04-25 23:54:44.422120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.056 [2024-04-25 23:54:44.422147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.056 [2024-04-25 23:54:44.422186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.056 [2024-04-25 23:54:44.422200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.056 #35 NEW cov: 11764 ft: 14522 corp: 27/1455b lim: 100 exec/s: 35 rss: 70Mb L: 41/82 MS: 1 ChangeBinInt- 00:07:55.056 [2024-04-25 23:54:44.452330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.056 [2024-04-25 23:54:44.452355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.056 [2024-04-25 23:54:44.452389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.056 [2024-04-25 23:54:44.452414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.056 [2024-04-25 23:54:44.452467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.056 [2024-04-25 23:54:44.452481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.056 #36 NEW cov: 11764 ft: 14532 corp: 28/1517b lim: 100 exec/s: 36 rss: 70Mb L: 62/82 MS: 1 CMP- DE: "\377\036"- 00:07:55.056 [2024-04-25 23:54:44.492228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 
nsid:0 00:07:55.056 [2024-04-25 23:54:44.492254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.056 #37 NEW cov: 11764 ft: 14554 corp: 29/1552b lim: 100 exec/s: 37 rss: 70Mb L: 35/82 MS: 1 ChangeBit- 00:07:55.056 [2024-04-25 23:54:44.532362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.056 [2024-04-25 23:54:44.532389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.056 #38 NEW cov: 11764 ft: 14569 corp: 30/1582b lim: 100 exec/s: 38 rss: 70Mb L: 30/82 MS: 1 ShuffleBytes- 00:07:55.056 [2024-04-25 23:54:44.572675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.056 [2024-04-25 23:54:44.572702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.056 [2024-04-25 23:54:44.572741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.056 [2024-04-25 23:54:44.572755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.056 [2024-04-25 23:54:44.572804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.056 [2024-04-25 23:54:44.572819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.056 #39 NEW cov: 11764 ft: 14587 corp: 31/1645b lim: 100 exec/s: 39 rss: 70Mb L: 63/82 MS: 1 InsertByte- 00:07:55.056 [2024-04-25 23:54:44.612707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.056 [2024-04-25 23:54:44.612732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.056 [2024-04-25 23:54:44.612769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.056 [2024-04-25 23:54:44.612782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.056 #40 NEW cov: 11764 ft: 14594 corp: 32/1686b lim: 100 exec/s: 40 rss: 70Mb L: 41/82 MS: 1 CrossOver- 00:07:55.056 [2024-04-25 23:54:44.652807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.056 [2024-04-25 23:54:44.652833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.056 [2024-04-25 23:54:44.652873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.056 [2024-04-25 23:54:44.652888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.315 #41 NEW cov: 11764 ft: 14646 corp: 33/1727b lim: 100 exec/s: 41 rss: 70Mb L: 41/82 MS: 1 CopyPart- 00:07:55.315 [2024-04-25 23:54:44.693055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.315 [2024-04-25 23:54:44.693082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.693123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.315 [2024-04-25 23:54:44.693138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.693189] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.315 [2024-04-25 23:54:44.693205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.315 #42 NEW cov: 11764 ft: 14651 corp: 34/1787b lim: 100 exec/s: 42 rss: 70Mb L: 60/82 MS: 1 InsertRepeatedBytes- 00:07:55.315 [2024-04-25 23:54:44.732950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.315 [2024-04-25 23:54:44.732978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.315 #43 NEW cov: 11764 ft: 14676 corp: 35/1822b lim: 100 exec/s: 43 rss: 70Mb L: 35/82 MS: 1 ChangeBit- 00:07:55.315 [2024-04-25 23:54:44.773273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.315 [2024-04-25 23:54:44.773298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.773333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.315 [2024-04-25 23:54:44.773347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.773402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.315 [2024-04-25 23:54:44.773420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.315 #44 NEW cov: 11764 ft: 14683 corp: 36/1884b lim: 100 exec/s: 44 rss: 70Mb L: 62/82 MS: 1 InsertByte- 00:07:55.315 [2024-04-25 23:54:44.813558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.315 [2024-04-25 23:54:44.813584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.813619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.315 [2024-04-25 23:54:44.813634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.813684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.315 [2024-04-25 23:54:44.813699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.813752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:55.315 [2024-04-25 23:54:44.813767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.315 #45 NEW cov: 11764 ft: 
14696 corp: 37/1966b lim: 100 exec/s: 45 rss: 70Mb L: 82/82 MS: 1 ChangeBinInt- 00:07:55.315 [2024-04-25 23:54:44.853532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.315 [2024-04-25 23:54:44.853557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.853591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.315 [2024-04-25 23:54:44.853605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.853655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.315 [2024-04-25 23:54:44.853670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.315 #46 NEW cov: 11764 ft: 14699 corp: 38/2028b lim: 100 exec/s: 46 rss: 70Mb L: 62/82 MS: 1 ShuffleBytes- 00:07:55.315 [2024-04-25 23:54:44.893740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.315 [2024-04-25 23:54:44.893767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.893808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.315 [2024-04-25 23:54:44.893823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.893872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.315 [2024-04-25 23:54:44.893887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.893938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:55.315 [2024-04-25 23:54:44.893951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.315 #47 NEW cov: 11764 ft: 14709 corp: 39/2114b lim: 100 exec/s: 47 rss: 70Mb L: 86/86 MS: 1 InsertRepeatedBytes- 00:07:55.315 [2024-04-25 23:54:44.923824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.315 [2024-04-25 23:54:44.923850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.923888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.315 [2024-04-25 23:54:44.923903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.315 [2024-04-25 23:54:44.923956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.315 [2024-04-25 23:54:44.923970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.574 #48 NEW cov: 11764 ft: 14725 corp: 40/2185b lim: 100 exec/s: 48 rss: 70Mb L: 71/86 
MS: 1 CopyPart- 00:07:55.574 [2024-04-25 23:54:44.963858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.574 [2024-04-25 23:54:44.963884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.574 [2024-04-25 23:54:44.963923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.574 [2024-04-25 23:54:44.963938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.574 [2024-04-25 23:54:44.963991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.574 [2024-04-25 23:54:44.964005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.574 #49 NEW cov: 11764 ft: 14727 corp: 41/2248b lim: 100 exec/s: 49 rss: 70Mb L: 63/86 MS: 1 InsertByte- 00:07:55.574 [2024-04-25 23:54:44.993991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.574 [2024-04-25 23:54:44.994017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.574 [2024-04-25 23:54:44.994050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.574 [2024-04-25 23:54:44.994064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.574 [2024-04-25 23:54:44.994117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.574 [2024-04-25 23:54:44.994132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.574 #50 NEW cov: 11764 ft: 14738 corp: 42/2310b lim: 100 exec/s: 50 rss: 70Mb L: 62/86 MS: 1 ChangeBit- 00:07:55.574 [2024-04-25 23:54:45.034098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.574 [2024-04-25 23:54:45.034124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.574 [2024-04-25 23:54:45.034156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.574 [2024-04-25 23:54:45.034169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.574 [2024-04-25 23:54:45.034220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.574 [2024-04-25 23:54:45.034236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.574 #51 NEW cov: 11764 ft: 14740 corp: 43/2371b lim: 100 exec/s: 51 rss: 70Mb L: 61/86 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:55.574 [2024-04-25 23:54:45.074334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.574 [2024-04-25 23:54:45.074359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.574 [2024-04-25 
23:54:45.074411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.574 [2024-04-25 23:54:45.074426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.574 [2024-04-25 23:54:45.074480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.575 [2024-04-25 23:54:45.074495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.575 [2024-04-25 23:54:45.074546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:55.575 [2024-04-25 23:54:45.074560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.575 #52 NEW cov: 11764 ft: 14744 corp: 44/2454b lim: 100 exec/s: 52 rss: 70Mb L: 83/86 MS: 1 InsertByte- 00:07:55.575 [2024-04-25 23:54:45.114446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.575 [2024-04-25 23:54:45.114472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.575 [2024-04-25 23:54:45.114520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.575 [2024-04-25 23:54:45.114534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.575 [2024-04-25 23:54:45.114585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.575 [2024-04-25 23:54:45.114599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.575 [2024-04-25 23:54:45.114650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:55.575 [2024-04-25 23:54:45.114663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.575 #53 NEW cov: 11764 ft: 14754 corp: 45/2542b lim: 100 exec/s: 53 rss: 71Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:07:55.575 [2024-04-25 23:54:45.154547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:55.575 [2024-04-25 23:54:45.154573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.575 [2024-04-25 23:54:45.154620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:55.575 [2024-04-25 23:54:45.154634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.575 [2024-04-25 23:54:45.154687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:55.575 [2024-04-25 23:54:45.154701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.575 [2024-04-25 23:54:45.154754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:55.575 [2024-04-25 23:54:45.154768] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.575 #54 NEW cov: 11764 ft: 14763 corp: 46/2637b lim: 100 exec/s: 27 rss: 71Mb L: 95/95 MS: 1 InsertRepeatedBytes- 00:07:55.575 #54 DONE cov: 11764 ft: 14763 corp: 46/2637b lim: 100 exec/s: 27 rss: 71Mb 00:07:55.575 ###### Recommended dictionary. ###### 00:07:55.575 "\377\377\377\377" # Uses: 1 00:07:55.575 "\377\036" # Uses: 0 00:07:55.575 ###### End of recommended dictionary. ###### 00:07:55.575 Done 54 runs in 2 second(s) 00:07:55.834 23:54:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:07:55.834 23:54:45 -- ../common.sh@72 -- # (( i++ )) 00:07:55.834 23:54:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.834 23:54:45 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:07:55.834 23:54:45 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:07:55.834 23:54:45 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.834 23:54:45 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.834 23:54:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:55.834 23:54:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:07:55.834 23:54:45 -- nvmf/run.sh@29 -- # printf %02d 19 00:07:55.834 23:54:45 -- nvmf/run.sh@29 -- # port=4419 00:07:55.834 23:54:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:55.834 23:54:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:07:55.834 23:54:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.834 23:54:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:07:55.834 [2024-04-25 23:54:45.336858] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:55.834 [2024-04-25 23:54:45.336951] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480025 ] 00:07:55.834 EAL: No free 2048 kB hugepages reported on node 1 00:07:56.093 [2024-04-25 23:54:45.509188] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.093 [2024-04-25 23:54:45.528089] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:56.093 [2024-04-25 23:54:45.528210] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.093 [2024-04-25 23:54:45.579740] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.093 [2024-04-25 23:54:45.596041] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:07:56.093 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:56.093 INFO: Seed: 1152793545 00:07:56.093 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:56.093 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:56.093 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:56.093 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.093 #2 INITED exec/s: 0 rss: 59Mb 00:07:56.093 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:56.093 This may also happen if the target rejected all inputs we tried so far 00:07:56.093 [2024-04-25 23:54:45.641017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:20304 00:07:56.093 [2024-04-25 23:54:45.641050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.351 NEW_FUNC[1/663]: 0x4bdb20 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:07:56.351 NEW_FUNC[2/663]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:56.351 #7 NEW cov: 11515 ft: 11513 corp: 2/20b lim: 50 exec/s: 0 rss: 67Mb L: 19/19 MS: 5 ChangeBit-ShuffleBytes-CrossOver-InsertByte-InsertRepeatedBytes- 00:07:56.351 [2024-04-25 23:54:45.951815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:56.351 [2024-04-25 23:54:45.951848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.610 #8 NEW cov: 11628 ft: 11896 corp: 3/39b lim: 50 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 ChangeByte- 00:07:56.610 [2024-04-25 23:54:45.991901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:20304 00:07:56.610 [2024-04-25 23:54:45.991929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.610 #9 NEW cov: 11634 ft: 12198 corp: 4/58b lim: 50 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 ShuffleBytes- 00:07:56.610 [2024-04-25 23:54:46.032094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:56.610 [2024-04-25 23:54:46.032123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.610 [2024-04-25 23:54:46.032165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5693957644512874319 len:20304 00:07:56.610 [2024-04-25 23:54:46.032181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.610 #10 NEW cov: 11719 ft: 12809 corp: 5/79b lim: 50 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 CMP- DE: "\005\000"- 00:07:56.610 [2024-04-25 23:54:46.072170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:9552 00:07:56.610 [2024-04-25 23:54:46.072197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:07:56.610 [2024-04-25 23:54:46.072247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5693957644512874319 len:20304 00:07:56.610 [2024-04-25 23:54:46.072262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.610 #11 NEW cov: 11719 ft: 12863 corp: 6/100b lim: 50 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 ChangeByte- 00:07:56.610 [2024-04-25 23:54:46.112285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:56.610 [2024-04-25 23:54:46.112312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.610 [2024-04-25 23:54:46.112345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5693957644512874319 len:20304 00:07:56.610 [2024-04-25 23:54:46.112360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.610 #12 NEW cov: 11719 ft: 13115 corp: 7/121b lim: 50 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 ShuffleBytes- 00:07:56.610 [2024-04-25 23:54:46.152338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:44209 00:07:56.610 [2024-04-25 23:54:46.152364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.610 #13 NEW cov: 11719 ft: 13227 corp: 8/140b lim: 50 exec/s: 0 rss: 67Mb L: 19/21 MS: 1 ChangeBinInt- 00:07:56.610 [2024-04-25 23:54:46.192468] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917243 len:20304 00:07:56.610 [2024-04-25 23:54:46.192493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.610 #14 NEW cov: 11719 ft: 13350 corp: 9/159b lim: 50 exec/s: 0 rss: 67Mb L: 19/21 MS: 1 ChangeByte- 00:07:56.869 [2024-04-25 23:54:46.222686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:56.869 [2024-04-25 23:54:46.222713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.222763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5705784751790444367 len:20304 00:07:56.869 [2024-04-25 23:54:46.222780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.869 #15 NEW cov: 11719 ft: 13410 corp: 10/181b lim: 50 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 InsertByte- 00:07:56.869 [2024-04-25 23:54:46.263092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:56.869 [2024-04-25 23:54:46.263118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.263157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873314905640783 len:1 00:07:56.869 
[2024-04-25 23:54:46.263174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.263227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:56.869 [2024-04-25 23:54:46.263243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.263293] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:56.869 [2024-04-25 23:54:46.263308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.263361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:20235 00:07:56.869 [2024-04-25 23:54:46.263376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:56.869 #16 NEW cov: 11719 ft: 13761 corp: 11/231b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:07:56.869 [2024-04-25 23:54:46.302850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:20304 00:07:56.869 [2024-04-25 23:54:46.302876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.302926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:2566 00:07:56.869 [2024-04-25 23:54:46.302942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.869 #17 NEW cov: 11719 ft: 13825 corp: 12/252b lim: 50 exec/s: 0 rss: 68Mb L: 21/50 MS: 1 PersAutoDict- DE: "\005\000"- 00:07:56.869 [2024-04-25 23:54:46.343327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:56.869 [2024-04-25 23:54:46.343353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.343402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873314905640783 len:1 00:07:56.869 [2024-04-25 23:54:46.343417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.343467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:56.869 [2024-04-25 23:54:46.343482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.343533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1407374883553280 len:1 00:07:56.869 [2024-04-25 23:54:46.343548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.343599] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:20235 00:07:56.869 [2024-04-25 23:54:46.343615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:56.869 #18 NEW cov: 11719 ft: 13896 corp: 13/302b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 PersAutoDict- DE: "\005\000"- 00:07:56.869 [2024-04-25 23:54:46.383221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:659488768 len:1 00:07:56.869 [2024-04-25 23:54:46.383250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.383284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:56.869 [2024-04-25 23:54:46.383300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.869 [2024-04-25 23:54:46.383352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:56.869 [2024-04-25 23:54:46.383366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.869 #19 NEW cov: 11719 ft: 14121 corp: 14/337b lim: 50 exec/s: 0 rss: 68Mb L: 35/50 MS: 1 EraseBytes- 00:07:56.869 [2024-04-25 23:54:46.423124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12732889067472728496 len:20304 00:07:56.869 [2024-04-25 23:54:46.423150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.869 #20 NEW cov: 11719 ft: 14143 corp: 15/356b lim: 50 exec/s: 0 rss: 68Mb L: 19/50 MS: 1 ChangeBinInt- 00:07:56.869 [2024-04-25 23:54:46.453200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714799805866856251 len:20304 00:07:56.869 [2024-04-25 23:54:46.453225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.869 #21 NEW cov: 11719 ft: 14170 corp: 16/375b lim: 50 exec/s: 0 rss: 68Mb L: 19/50 MS: 1 ChangeByte- 00:07:57.129 [2024-04-25 23:54:46.493323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:57.129 [2024-04-25 23:54:46.493350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.129 #22 NEW cov: 11719 ft: 14190 corp: 17/394b lim: 50 exec/s: 0 rss: 68Mb L: 19/50 MS: 1 ChangeBit- 00:07:57.129 [2024-04-25 23:54:46.523760] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917243 len:20304 00:07:57.129 [2024-04-25 23:54:46.523785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.523822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5715067924168789839 len:65536 00:07:57.129 [2024-04-25 23:54:46.523837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.523889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:57.129 [2024-04-25 23:54:46.523904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.523956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:07:57.129 [2024-04-25 23:54:46.523972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.129 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.129 #23 NEW cov: 11742 ft: 14248 corp: 18/438b lim: 50 exec/s: 0 rss: 68Mb L: 44/50 MS: 1 InsertRepeatedBytes- 00:07:57.129 [2024-04-25 23:54:46.563664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12731874398142147249 len:9552 00:07:57.129 [2024-04-25 23:54:46.563691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.563739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20235 00:07:57.129 [2024-04-25 23:54:46.563759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.129 #24 NEW cov: 11742 ft: 14278 corp: 19/458b lim: 50 exec/s: 0 rss: 68Mb L: 20/50 MS: 1 CrossOver- 00:07:57.129 [2024-04-25 23:54:46.603997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:31355 00:07:57.129 [2024-04-25 23:54:46.604022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.604069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8825501086245354106 len:31355 00:07:57.129 [2024-04-25 23:54:46.604085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.604136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8825501086245354106 len:31355 00:07:57.129 [2024-04-25 23:54:46.604151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.604202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:5714873654932306554 len:20304 00:07:57.129 [2024-04-25 23:54:46.604217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.129 #25 NEW cov: 11742 ft: 14286 corp: 20/503b lim: 50 exec/s: 0 rss: 68Mb L: 45/50 MS: 1 InsertRepeatedBytes- 00:07:57.129 [2024-04-25 23:54:46.644033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3048850842395814439 len:45237 00:07:57.129 [2024-04-25 23:54:46.644059] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.644103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5709244154671091535 len:3110 00:07:57.129 [2024-04-25 23:54:46.644119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.644171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:5714873654208057167 len:20304 00:07:57.129 [2024-04-25 23:54:46.644187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.129 #26 NEW cov: 11742 ft: 14306 corp: 21/542b lim: 50 exec/s: 26 rss: 68Mb L: 39/50 MS: 1 CrossOver- 00:07:57.129 [2024-04-25 23:54:46.684224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:31355 00:07:57.129 [2024-04-25 23:54:46.684249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.684296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8825501086245354106 len:31355 00:07:57.129 [2024-04-25 23:54:46.684312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.684362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8825501086245354106 len:31355 00:07:57.129 [2024-04-25 23:54:46.684378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.684434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:5714873654932306554 len:39504 00:07:57.129 [2024-04-25 23:54:46.684451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.129 #27 NEW cov: 11742 ft: 14341 corp: 22/587b lim: 50 exec/s: 27 rss: 68Mb L: 45/50 MS: 1 ChangeByte- 00:07:57.129 [2024-04-25 23:54:46.724460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:57.129 [2024-04-25 23:54:46.724487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.724538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:86862743994368 len:1 00:07:57.129 [2024-04-25 23:54:46.724554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.724608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:57.129 [2024-04-25 23:54:46.724624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.724678] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:57.129 [2024-04-25 23:54:46.724694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.129 [2024-04-25 23:54:46.724746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:20235 00:07:57.129 [2024-04-25 23:54:46.724762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:57.388 #28 NEW cov: 11742 ft: 14390 corp: 23/637b lim: 50 exec/s: 28 rss: 68Mb L: 50/50 MS: 1 CrossOver- 00:07:57.388 [2024-04-25 23:54:46.764477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:31355 00:07:57.388 [2024-04-25 23:54:46.764503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.388 [2024-04-25 23:54:46.764550] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8825501086245354106 len:31355 00:07:57.388 [2024-04-25 23:54:46.764566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.388 [2024-04-25 23:54:46.764617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8825501086245354106 len:31355 00:07:57.388 [2024-04-25 23:54:46.764634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.388 [2024-04-25 23:54:46.764686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:8825500901561760378 len:20304 00:07:57.388 [2024-04-25 23:54:46.764701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.388 #29 NEW cov: 11742 ft: 14436 corp: 24/685b lim: 50 exec/s: 29 rss: 68Mb L: 48/50 MS: 1 CopyPart- 00:07:57.388 [2024-04-25 23:54:46.804611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653546864463 len:31355 00:07:57.388 [2024-04-25 23:54:46.804637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.388 [2024-04-25 23:54:46.804682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8825501086245354106 len:31355 00:07:57.388 [2024-04-25 23:54:46.804698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.388 [2024-04-25 23:54:46.804748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8825501086245354106 len:31355 00:07:57.388 [2024-04-25 23:54:46.804764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.388 [2024-04-25 23:54:46.804818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:5714873654932306554 len:39504 00:07:57.388 [2024-04-25 23:54:46.804834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.388 #30 NEW cov: 11742 ft: 14475 corp: 25/730b lim: 50 exec/s: 30 rss: 69Mb L: 45/50 MS: 1 ChangeByte- 00:07:57.388 [2024-04-25 23:54:46.844704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3182967604304948268 len:11309 00:07:57.388 [2024-04-25 23:54:46.844730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.388 [2024-04-25 23:54:46.844779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3182967604875373612 len:11309 00:07:57.388 [2024-04-25 23:54:46.844794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.388 [2024-04-25 23:54:46.844847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3182967604875373612 len:11309 00:07:57.389 [2024-04-25 23:54:46.844863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.389 [2024-04-25 23:54:46.844915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3182967604875373612 len:11309 00:07:57.389 [2024-04-25 23:54:46.844930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.389 #31 NEW cov: 11742 ft: 14508 corp: 26/774b lim: 50 exec/s: 31 rss: 69Mb L: 44/50 MS: 1 InsertRepeatedBytes- 00:07:57.389 [2024-04-25 23:54:46.884535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5695451880141508431 len:20269 00:07:57.389 [2024-04-25 23:54:46.884561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.389 #32 NEW cov: 11742 ft: 14542 corp: 27/785b lim: 50 exec/s: 32 rss: 69Mb L: 11/50 MS: 1 CrossOver- 00:07:57.389 [2024-04-25 23:54:46.924929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:57.389 [2024-04-25 23:54:46.924956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.389 [2024-04-25 23:54:46.924999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873314905640783 len:1 00:07:57.389 [2024-04-25 23:54:46.925015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.389 [2024-04-25 23:54:46.925067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:57.389 [2024-04-25 23:54:46.925083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.389 [2024-04-25 23:54:46.925135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:57.389 [2024-04-25 23:54:46.925150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.389 #33 NEW cov: 11742 ft: 14564 corp: 28/833b lim: 50 exec/s: 
33 rss: 69Mb L: 48/50 MS: 1 EraseBytes- 00:07:57.389 [2024-04-25 23:54:46.965012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:31355 00:07:57.389 [2024-04-25 23:54:46.965040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.389 [2024-04-25 23:54:46.965082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8825501086245354106 len:31355 00:07:57.389 [2024-04-25 23:54:46.965101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.389 [2024-04-25 23:54:46.965152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8825483494059309690 len:31355 00:07:57.389 [2024-04-25 23:54:46.965168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.389 [2024-04-25 23:54:46.965220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:5714873654932306554 len:20304 00:07:57.389 [2024-04-25 23:54:46.965235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.389 #34 NEW cov: 11742 ft: 14575 corp: 29/878b lim: 50 exec/s: 34 rss: 69Mb L: 45/50 MS: 1 ChangeBit- 00:07:57.648 [2024-04-25 23:54:47.005256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:57.648 [2024-04-25 23:54:47.005282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.005329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873314905640783 len:1 00:07:57.648 [2024-04-25 23:54:47.005345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.005400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:57.648 [2024-04-25 23:54:47.005416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.005466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:57.648 [2024-04-25 23:54:47.005481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.005532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:20235 00:07:57.648 [2024-04-25 23:54:47.005546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:57.648 #35 NEW cov: 11742 ft: 14606 corp: 30/928b lim: 50 exec/s: 35 rss: 69Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:57.648 [2024-04-25 23:54:47.045407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145897216 len:20304 00:07:57.648 [2024-04-25 
23:54:47.045434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.045487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873314905640783 len:1 00:07:57.648 [2024-04-25 23:54:47.045504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.045557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:57.648 [2024-04-25 23:54:47.045574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.045626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1407374883553280 len:1 00:07:57.648 [2024-04-25 23:54:47.045641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.045694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:20235 00:07:57.648 [2024-04-25 23:54:47.045710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:57.648 #36 NEW cov: 11742 ft: 14631 corp: 31/978b lim: 50 exec/s: 36 rss: 69Mb L: 50/50 MS: 1 CMP- DE: "\001\000"- 00:07:57.648 [2024-04-25 23:54:47.085497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:57.648 [2024-04-25 23:54:47.085525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.085573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873314905640783 len:1 00:07:57.648 [2024-04-25 23:54:47.085589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.085642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:57.648 [2024-04-25 23:54:47.085658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.085711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1407374883553280 len:1 00:07:57.648 [2024-04-25 23:54:47.085728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.085779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:20235 00:07:57.648 [2024-04-25 23:54:47.085796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:57.648 #37 NEW cov: 11742 ft: 14647 corp: 32/1028b lim: 50 exec/s: 37 rss: 69Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:57.648 [2024-04-25 23:54:47.125185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE 
sqid:1 cid:0 nsid:0 lba:5695451880141508431 len:45524 00:07:57.648 [2024-04-25 23:54:47.125213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.648 #43 NEW cov: 11742 ft: 14661 corp: 33/1039b lim: 50 exec/s: 43 rss: 69Mb L: 11/50 MS: 1 ChangeBinInt- 00:07:57.648 [2024-04-25 23:54:47.165536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3048850842395814439 len:45237 00:07:57.648 [2024-04-25 23:54:47.165562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.165597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5709244154671091535 len:3110 00:07:57.648 [2024-04-25 23:54:47.165611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.165671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:5714873654208057167 len:20304 00:07:57.648 [2024-04-25 23:54:47.165686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.648 #44 NEW cov: 11742 ft: 14674 corp: 34/1078b lim: 50 exec/s: 44 rss: 69Mb L: 39/50 MS: 1 ChangeBit- 00:07:57.648 [2024-04-25 23:54:47.205759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917243 len:20304 00:07:57.648 [2024-04-25 23:54:47.205786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.205832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5715067924168789839 len:65536 00:07:57.648 [2024-04-25 23:54:47.205847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.205900] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:57.648 [2024-04-25 23:54:47.205917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.205967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:43264 00:07:57.648 [2024-04-25 23:54:47.205982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.648 #45 NEW cov: 11742 ft: 14695 corp: 35/1122b lim: 50 exec/s: 45 rss: 69Mb L: 44/50 MS: 1 ChangeByte- 00:07:57.648 [2024-04-25 23:54:47.245873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:57.648 [2024-04-25 23:54:47.245900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.245937] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873314905640783 len:1 00:07:57.648 
[2024-04-25 23:54:47.245953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.246006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:57.648 [2024-04-25 23:54:47.246021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.648 [2024-04-25 23:54:47.246073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:07:57.648 [2024-04-25 23:54:47.246090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.907 #46 NEW cov: 11742 ft: 14705 corp: 36/1170b lim: 50 exec/s: 46 rss: 69Mb L: 48/50 MS: 1 ChangeByte- 00:07:57.907 [2024-04-25 23:54:47.285910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:31355 00:07:57.907 [2024-04-25 23:54:47.285937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.285977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8825501086245354106 len:31355 00:07:57.907 [2024-04-25 23:54:47.285993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.286044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8825501086245354106 len:31312 00:07:57.907 [2024-04-25 23:54:47.286060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.907 #47 NEW cov: 11742 ft: 14729 corp: 37/1201b lim: 50 exec/s: 47 rss: 69Mb L: 31/50 MS: 1 EraseBytes- 00:07:57.907 [2024-04-25 23:54:47.326004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:57.907 [2024-04-25 23:54:47.326031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.326069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873314905640783 len:1 00:07:57.907 [2024-04-25 23:54:47.326085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.326136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:57.907 [2024-04-25 23:54:47.326152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.907 #48 NEW cov: 11742 ft: 14747 corp: 38/1237b lim: 50 exec/s: 48 rss: 69Mb L: 36/50 MS: 1 EraseBytes- 00:07:57.907 [2024-04-25 23:54:47.366376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:20225 00:07:57.907 [2024-04-25 23:54:47.366407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.366441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:57.907 [2024-04-25 23:54:47.366453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.366506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:57.907 [2024-04-25 23:54:47.366521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.366572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1407374883553280 len:1 00:07:57.907 [2024-04-25 23:54:47.366587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.366638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:20235 00:07:57.907 [2024-04-25 23:54:47.366652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:57.907 #49 NEW cov: 11742 ft: 14784 corp: 39/1287b lim: 50 exec/s: 49 rss: 70Mb L: 50/50 MS: 1 CopyPart- 00:07:57.907 [2024-04-25 23:54:47.406422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:20225 00:07:57.907 [2024-04-25 23:54:47.406448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.406500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:57.907 [2024-04-25 23:54:47.406514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.406566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:57.907 [2024-04-25 23:54:47.406581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.406632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:5497558138880 len:1 00:07:57.907 [2024-04-25 23:54:47.406648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.406682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:20235 00:07:57.907 [2024-04-25 23:54:47.406698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:57.907 #50 NEW cov: 11742 ft: 14794 corp: 40/1337b lim: 50 exec/s: 50 rss: 70Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:57.907 [2024-04-25 23:54:47.446231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:31355 00:07:57.907 [2024-04-25 23:54:47.446258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.446300] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8825501086245354106 len:31355 00:07:57.907 [2024-04-25 23:54:47.446316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.907 #51 NEW cov: 11742 ft: 14808 corp: 41/1361b lim: 50 exec/s: 51 rss: 70Mb L: 24/50 MS: 1 EraseBytes- 00:07:57.907 [2024-04-25 23:54:47.486358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873653534543695 len:20230 00:07:57.907 [2024-04-25 23:54:47.486385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.907 [2024-04-25 23:54:47.486440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873652882657103 len:20304 00:07:57.907 [2024-04-25 23:54:47.486456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.907 #52 NEW cov: 11742 ft: 14819 corp: 42/1384b lim: 50 exec/s: 52 rss: 70Mb L: 23/50 MS: 1 PersAutoDict- DE: "\005\000"- 00:07:58.167 [2024-04-25 23:54:47.526594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12732889067472728496 len:20304 00:07:58.167 [2024-04-25 23:54:47.526621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.167 [2024-04-25 23:54:47.526655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714797787903109711 len:20402 00:07:58.167 [2024-04-25 23:54:47.526669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.167 [2024-04-25 23:54:47.526722] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:5714873656422649679 len:20235 00:07:58.167 [2024-04-25 23:54:47.526737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.167 #53 NEW cov: 11742 ft: 14829 corp: 43/1414b lim: 50 exec/s: 53 rss: 70Mb L: 30/50 MS: 1 CrossOver- 00:07:58.167 [2024-04-25 23:54:47.566474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12732889067472728496 len:20304 00:07:58.167 [2024-04-25 23:54:47.566500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.167 #54 NEW cov: 11742 ft: 14837 corp: 44/1427b lim: 50 exec/s: 54 rss: 70Mb L: 13/50 MS: 1 EraseBytes- 00:07:58.167 [2024-04-25 23:54:47.606942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:58.167 [2024-04-25 23:54:47.606968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.167 [2024-04-25 23:54:47.607018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5714873314905640783 len:1 00:07:58.167 [2024-04-25 23:54:47.607034] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.167 [2024-04-25 23:54:47.607085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:4 00:07:58.167 [2024-04-25 23:54:47.607102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.167 [2024-04-25 23:54:47.607157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:1407374883553280 len:1 00:07:58.167 [2024-04-25 23:54:47.607173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.167 [2024-04-25 23:54:47.607225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:20235 00:07:58.167 [2024-04-25 23:54:47.607239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:58.167 #55 NEW cov: 11742 ft: 14847 corp: 45/1477b lim: 50 exec/s: 55 rss: 70Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:58.167 [2024-04-25 23:54:47.646755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5714873473145917263 len:20304 00:07:58.167 [2024-04-25 23:54:47.646784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.167 [2024-04-25 23:54:47.646835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5705784751790444367 len:20304 00:07:58.167 [2024-04-25 23:54:47.646850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.167 #56 NEW cov: 11742 ft: 14865 corp: 46/1499b lim: 50 exec/s: 28 rss: 70Mb L: 22/50 MS: 1 ShuffleBytes- 00:07:58.167 #56 DONE cov: 11742 ft: 14865 corp: 46/1499b lim: 50 exec/s: 28 rss: 70Mb 00:07:58.167 ###### Recommended dictionary. ###### 00:07:58.167 "\005\000" # Uses: 3 00:07:58.167 "\001\000" # Uses: 0 00:07:58.167 ###### End of recommended dictionary. 
###### 00:07:58.167 Done 56 runs in 2 second(s) 00:07:58.426 23:54:47 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:07:58.426 23:54:47 -- ../common.sh@72 -- # (( i++ )) 00:07:58.426 23:54:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.426 23:54:47 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:07:58.426 23:54:47 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:07:58.426 23:54:47 -- nvmf/run.sh@24 -- # local timen=1 00:07:58.426 23:54:47 -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.426 23:54:47 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:58.426 23:54:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:07:58.426 23:54:47 -- nvmf/run.sh@29 -- # printf %02d 20 00:07:58.426 23:54:47 -- nvmf/run.sh@29 -- # port=4420 00:07:58.426 23:54:47 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:58.426 23:54:47 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:07:58.426 23:54:47 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.426 23:54:47 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:07:58.426 [2024-04-25 23:54:47.827598] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:07:58.426 [2024-04-25 23:54:47.827682] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480556 ] 00:07:58.426 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.426 [2024-04-25 23:54:48.002670] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.426 [2024-04-25 23:54:48.021686] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:58.426 [2024-04-25 23:54:48.021819] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.685 [2024-04-25 23:54:48.073447] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.685 [2024-04-25 23:54:48.089728] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:07:58.685 INFO: Running with entropic power schedule (0xFF, 100). 00:07:58.685 INFO: Seed: 3646776221 00:07:58.685 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:07:58.685 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:07:58.685 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:58.685 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.685 #2 INITED exec/s: 0 rss: 59Mb 00:07:58.685 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
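For readability, here is a consolidated sketch of the launch sequence traced above by nvmf/run.sh@23-36. The "Recommended dictionary" block closing run 19 records byte sequences the mutator found useful ("\005\000" used 3 times, "\001\000" once); in stock libFuzzer these could seed a -dict= file, though the trace does not show the SPDK wrapper using one. In the sketch, SPDK_DIR and OUT are stand-in variables for the long Jenkins workspace paths, the redirect of the sed output into the per-fuzzer config is an assumption (xtrace does not print redirections), and the port computation and flag readings in the comments are inferred from the surrounding trace rather than taken from the tool's help text.

# Sketch of one llvm_nvme_fuzz launch against the local NVMe-oF TCP target,
# reconstructed from the run.sh trace above; SPDK_DIR and OUT are stand-ins.
FUZZER=20                                # fuzzer_type in run.sh
PORT=44$(printf %02d "$FUZZER")          # inferred: yields 4420 for fuzzer 20
CORPUS="$SPDK_DIR/../corpus/llvm_nvmf_$FUZZER"
mkdir -p "$CORPUS"
TRID="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$PORT"
# Stamp the computed port into the JSON config template; for fuzzer 20 the
# template already carries "trsvcid": "4420", so the substitution is a no-op.
# The redirect below is assumed -- the trace only shows the sed command.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$PORT\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$FUZZER.conf"
# Inferred flag readings: -m 0x1 core mask, -s 512 hugepage memory in MB,
# -P artifact/output dir, -F transport ID of the target to fuzz, -c NVMe-oF
# JSON config, -t 1 time budget (timen above), -D corpus dir, -Z fuzzer
# number, -r RPC socket path.
"$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P "$OUT/llvm/" -F "$TRID" -c "/tmp/fuzz_json_$FUZZER.conf" -t 1 \
    -D "$CORPUS" -Z "$FUZZER" -r "/var/tmp/spdk$FUZZER.sock"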
00:07:58.685 This may also happen if the target rejected all inputs we tried so far 00:07:58.685 [2024-04-25 23:54:48.134857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:58.685 [2024-04-25 23:54:48.134889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.944 NEW_FUNC[1/665]: 0x4bf6e0 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:07:58.944 NEW_FUNC[2/665]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.944 #5 NEW cov: 11573 ft: 11574 corp: 2/25b lim: 90 exec/s: 0 rss: 67Mb L: 24/24 MS: 3 InsertByte-EraseBytes-InsertRepeatedBytes- 00:07:58.944 [2024-04-25 23:54:48.445850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:58.944 [2024-04-25 23:54:48.445884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.944 [2024-04-25 23:54:48.445936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:58.944 [2024-04-25 23:54:48.445951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.944 [2024-04-25 23:54:48.446005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:58.944 [2024-04-25 23:54:48.446020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.944 #6 NEW cov: 11686 ft: 12982 corp: 3/91b lim: 90 exec/s: 0 rss: 67Mb L: 66/66 MS: 1 InsertRepeatedBytes- 00:07:58.944 [2024-04-25 23:54:48.485898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:58.944 [2024-04-25 23:54:48.485925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.944 [2024-04-25 23:54:48.485968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:58.944 [2024-04-25 23:54:48.485984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.944 [2024-04-25 23:54:48.486036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:58.944 [2024-04-25 23:54:48.486051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.944 #7 NEW cov: 11692 ft: 13090 corp: 4/158b lim: 90 exec/s: 0 rss: 67Mb L: 67/67 MS: 1 InsertByte- 00:07:58.944 [2024-04-25 23:54:48.525660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:58.944 [2024-04-25 23:54:48.525686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.944 #13 NEW cov: 11777 ft: 13416 corp: 5/183b lim: 90 exec/s: 0 rss: 67Mb L: 25/67 MS: 1 CrossOver- 00:07:59.203 [2024-04-25 23:54:48.566124] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.203 [2024-04-25 23:54:48.566151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.203 [2024-04-25 23:54:48.566188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.203 [2024-04-25 23:54:48.566203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.203 [2024-04-25 23:54:48.566258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:59.203 [2024-04-25 23:54:48.566272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.203 #14 NEW cov: 11777 ft: 13477 corp: 6/250b lim: 90 exec/s: 0 rss: 67Mb L: 67/67 MS: 1 ChangeASCIIInt- 00:07:59.203 [2024-04-25 23:54:48.605921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.203 [2024-04-25 23:54:48.605950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.203 #15 NEW cov: 11777 ft: 13512 corp: 7/274b lim: 90 exec/s: 0 rss: 67Mb L: 24/67 MS: 1 ShuffleBytes- 00:07:59.203 [2024-04-25 23:54:48.646372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.203 [2024-04-25 23:54:48.646404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.203 [2024-04-25 23:54:48.646457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.203 [2024-04-25 23:54:48.646474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.203 [2024-04-25 23:54:48.646529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:59.203 [2024-04-25 23:54:48.646545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.203 #16 NEW cov: 11777 ft: 13574 corp: 8/341b lim: 90 exec/s: 0 rss: 68Mb L: 67/67 MS: 1 ChangeBinInt- 00:07:59.203 [2024-04-25 23:54:48.686160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.203 [2024-04-25 23:54:48.686186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.203 #17 NEW cov: 11777 ft: 13641 corp: 9/365b lim: 90 exec/s: 0 rss: 68Mb L: 24/67 MS: 1 ChangeBinInt- 00:07:59.203 [2024-04-25 23:54:48.726341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.203 [2024-04-25 23:54:48.726368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.203 #18 NEW cov: 11777 ft: 13697 corp: 10/389b lim: 90 exec/s: 0 rss: 68Mb L: 24/67 MS: 1 CopyPart- 00:07:59.203 [2024-04-25 23:54:48.766350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.203 [2024-04-25 23:54:48.766376] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.203 #19 NEW cov: 11777 ft: 13741 corp: 11/414b lim: 90 exec/s: 0 rss: 68Mb L: 25/67 MS: 1 InsertByte- 00:07:59.203 [2024-04-25 23:54:48.796454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.203 [2024-04-25 23:54:48.796480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 #20 NEW cov: 11777 ft: 13786 corp: 12/439b lim: 90 exec/s: 0 rss: 68Mb L: 25/67 MS: 1 InsertByte- 00:07:59.462 [2024-04-25 23:54:48.836576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.462 [2024-04-25 23:54:48.836603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 #21 NEW cov: 11777 ft: 13870 corp: 13/464b lim: 90 exec/s: 0 rss: 68Mb L: 25/67 MS: 1 ChangeBinInt- 00:07:59.462 [2024-04-25 23:54:48.876735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.462 [2024-04-25 23:54:48.876761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 #22 NEW cov: 11777 ft: 13911 corp: 14/496b lim: 90 exec/s: 0 rss: 68Mb L: 32/67 MS: 1 CMP- DE: "\001\000\000\000\000\000\000?"- 00:07:59.462 [2024-04-25 23:54:48.906832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.462 [2024-04-25 23:54:48.906859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 #23 NEW cov: 11777 ft: 13939 corp: 15/528b lim: 90 exec/s: 0 rss: 68Mb L: 32/67 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000?"- 00:07:59.462 [2024-04-25 23:54:48.936897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.462 [2024-04-25 23:54:48.936922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 #24 NEW cov: 11777 ft: 13996 corp: 16/552b lim: 90 exec/s: 0 rss: 68Mb L: 24/67 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:59.462 [2024-04-25 23:54:48.977285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.462 [2024-04-25 23:54:48.977313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 [2024-04-25 23:54:48.977351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.462 [2024-04-25 23:54:48.977367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.462 [2024-04-25 23:54:48.977426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:59.462 [2024-04-25 23:54:48.977442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.462 #25 NEW cov: 11777 ft: 14026 corp: 17/619b lim: 90 exec/s: 0 rss: 
68Mb L: 67/67 MS: 1 ChangeByte- 00:07:59.462 [2024-04-25 23:54:49.017269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.462 [2024-04-25 23:54:49.017295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 [2024-04-25 23:54:49.017350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.462 [2024-04-25 23:54:49.017365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.462 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:59.462 #26 NEW cov: 11800 ft: 14356 corp: 18/670b lim: 90 exec/s: 0 rss: 68Mb L: 51/67 MS: 1 EraseBytes- 00:07:59.462 [2024-04-25 23:54:49.067564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.462 [2024-04-25 23:54:49.067591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.462 [2024-04-25 23:54:49.067629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.462 [2024-04-25 23:54:49.067645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.462 [2024-04-25 23:54:49.067696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:59.462 [2024-04-25 23:54:49.067712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.720 #27 NEW cov: 11800 ft: 14375 corp: 19/737b lim: 90 exec/s: 0 rss: 68Mb L: 67/67 MS: 1 CrossOver- 00:07:59.720 [2024-04-25 23:54:49.107312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.720 [2024-04-25 23:54:49.107338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.720 #28 NEW cov: 11800 ft: 14381 corp: 20/763b lim: 90 exec/s: 28 rss: 68Mb L: 26/67 MS: 1 InsertByte- 00:07:59.720 [2024-04-25 23:54:49.147640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.720 [2024-04-25 23:54:49.147667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.720 [2024-04-25 23:54:49.147709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.720 [2024-04-25 23:54:49.147724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.720 #29 NEW cov: 11800 ft: 14406 corp: 21/805b lim: 90 exec/s: 29 rss: 69Mb L: 42/67 MS: 1 InsertRepeatedBytes- 00:07:59.720 [2024-04-25 23:54:49.187598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.720 [2024-04-25 23:54:49.187625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.720 #40 NEW cov: 11800 ft: 14414 corp: 
22/829b lim: 90 exec/s: 40 rss: 69Mb L: 24/67 MS: 1 ChangeBinInt- 00:07:59.720 [2024-04-25 23:54:49.227736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.720 [2024-04-25 23:54:49.227762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.720 #41 NEW cov: 11800 ft: 14419 corp: 23/853b lim: 90 exec/s: 41 rss: 69Mb L: 24/67 MS: 1 ChangeByte- 00:07:59.720 [2024-04-25 23:54:49.258084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.720 [2024-04-25 23:54:49.258111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.720 [2024-04-25 23:54:49.258158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.720 [2024-04-25 23:54:49.258175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.720 [2024-04-25 23:54:49.258230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:59.720 [2024-04-25 23:54:49.258246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.720 #42 NEW cov: 11800 ft: 14438 corp: 24/920b lim: 90 exec/s: 42 rss: 69Mb L: 67/67 MS: 1 InsertByte- 00:07:59.721 [2024-04-25 23:54:49.297931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.721 [2024-04-25 23:54:49.297957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.721 #43 NEW cov: 11800 ft: 14526 corp: 25/944b lim: 90 exec/s: 43 rss: 69Mb L: 24/67 MS: 1 CrossOver- 00:07:59.979 [2024-04-25 23:54:49.338030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.979 [2024-04-25 23:54:49.338056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.979 #44 NEW cov: 11800 ft: 14665 corp: 26/968b lim: 90 exec/s: 44 rss: 69Mb L: 24/67 MS: 1 ChangeBit- 00:07:59.979 [2024-04-25 23:54:49.378425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.979 [2024-04-25 23:54:49.378451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.979 [2024-04-25 23:54:49.378487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.979 [2024-04-25 23:54:49.378502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.979 [2024-04-25 23:54:49.378554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:59.979 [2024-04-25 23:54:49.378569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.979 #45 NEW cov: 11800 ft: 14748 corp: 27/1035b lim: 90 exec/s: 45 rss: 69Mb L: 67/67 MS: 1 ChangeBit- 00:07:59.979 [2024-04-25 23:54:49.418228] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.979 [2024-04-25 23:54:49.418255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.979 #46 NEW cov: 11800 ft: 14764 corp: 28/1059b lim: 90 exec/s: 46 rss: 69Mb L: 24/67 MS: 1 ShuffleBytes- 00:07:59.979 [2024-04-25 23:54:49.448457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.979 [2024-04-25 23:54:49.448483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.979 [2024-04-25 23:54:49.448522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.979 [2024-04-25 23:54:49.448538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.979 #47 NEW cov: 11800 ft: 14784 corp: 29/1101b lim: 90 exec/s: 47 rss: 69Mb L: 42/67 MS: 1 ChangeBinInt- 00:07:59.979 [2024-04-25 23:54:49.488747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.979 [2024-04-25 23:54:49.488773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.979 [2024-04-25 23:54:49.488816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.979 [2024-04-25 23:54:49.488832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.979 [2024-04-25 23:54:49.488890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:59.979 [2024-04-25 23:54:49.488905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.979 #48 NEW cov: 11800 ft: 14804 corp: 30/1168b lim: 90 exec/s: 48 rss: 69Mb L: 67/67 MS: 1 ChangeByte- 00:07:59.979 [2024-04-25 23:54:49.528714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.979 [2024-04-25 23:54:49.528742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.979 [2024-04-25 23:54:49.528782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.979 [2024-04-25 23:54:49.528797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.979 #52 NEW cov: 11800 ft: 14815 corp: 31/1213b lim: 90 exec/s: 52 rss: 70Mb L: 45/67 MS: 4 EraseBytes-EraseBytes-CopyPart-InsertRepeatedBytes- 00:07:59.979 [2024-04-25 23:54:49.569141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:59.979 [2024-04-25 23:54:49.569167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.979 [2024-04-25 23:54:49.569215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:59.979 [2024-04-25 23:54:49.569230] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.979 [2024-04-25 23:54:49.569285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:59.979 [2024-04-25 23:54:49.569300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.979 [2024-04-25 23:54:49.569355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:59.979 [2024-04-25 23:54:49.569370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.238 #53 NEW cov: 11800 ft: 15158 corp: 32/1293b lim: 90 exec/s: 53 rss: 70Mb L: 80/80 MS: 1 InsertRepeatedBytes- 00:08:00.238 [2024-04-25 23:54:49.608804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.238 [2024-04-25 23:54:49.608830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.238 #54 NEW cov: 11800 ft: 15167 corp: 33/1325b lim: 90 exec/s: 54 rss: 70Mb L: 32/80 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000?"- 00:08:00.238 [2024-04-25 23:54:49.649378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.238 [2024-04-25 23:54:49.649409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.649448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.238 [2024-04-25 23:54:49.649465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.649519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:00.238 [2024-04-25 23:54:49.649535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.649588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:00.238 [2024-04-25 23:54:49.649603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.238 #55 NEW cov: 11800 ft: 15178 corp: 34/1413b lim: 90 exec/s: 55 rss: 70Mb L: 88/88 MS: 1 CrossOver- 00:08:00.238 [2024-04-25 23:54:49.689169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.238 [2024-04-25 23:54:49.689195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.689236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.238 [2024-04-25 23:54:49.689252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.238 #56 NEW cov: 11800 ft: 15187 corp: 35/1464b lim: 90 exec/s: 56 rss: 70Mb L: 51/88 MS: 1 InsertRepeatedBytes- 00:08:00.238 [2024-04-25 
23:54:49.729723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.238 [2024-04-25 23:54:49.729749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.729802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.238 [2024-04-25 23:54:49.729817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.729871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:00.238 [2024-04-25 23:54:49.729885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.729940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:00.238 [2024-04-25 23:54:49.729955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.730007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:00.238 [2024-04-25 23:54:49.730024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:00.238 #57 NEW cov: 11800 ft: 15231 corp: 36/1554b lim: 90 exec/s: 57 rss: 70Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:00.238 [2024-04-25 23:54:49.769705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.238 [2024-04-25 23:54:49.769732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.769773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.238 [2024-04-25 23:54:49.769792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.769845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:00.238 [2024-04-25 23:54:49.769862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.238 [2024-04-25 23:54:49.769917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:00.238 [2024-04-25 23:54:49.769932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.238 #58 NEW cov: 11800 ft: 15279 corp: 37/1636b lim: 90 exec/s: 58 rss: 70Mb L: 82/90 MS: 1 CopyPart- 00:08:00.238 [2024-04-25 23:54:49.809411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.238 [2024-04-25 23:54:49.809436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.238 #59 NEW cov: 11800 ft: 15307 corp: 38/1660b lim: 90 exec/s: 59 rss: 70Mb L: 24/90 MS: 1 ChangeBit- 00:08:00.498 [2024-04-25 
23:54:49.849850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.498 [2024-04-25 23:54:49.849877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.498 [2024-04-25 23:54:49.849912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.498 [2024-04-25 23:54:49.849928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.498 [2024-04-25 23:54:49.849983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:00.498 [2024-04-25 23:54:49.849997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.498 #60 NEW cov: 11800 ft: 15314 corp: 39/1727b lim: 90 exec/s: 60 rss: 70Mb L: 67/90 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:00.498 [2024-04-25 23:54:49.889957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.498 [2024-04-25 23:54:49.889983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.498 [2024-04-25 23:54:49.890021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.498 [2024-04-25 23:54:49.890037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.498 [2024-04-25 23:54:49.890092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:00.498 [2024-04-25 23:54:49.890108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.498 #61 NEW cov: 11800 ft: 15324 corp: 40/1794b lim: 90 exec/s: 61 rss: 70Mb L: 67/90 MS: 1 ChangeASCIIInt- 00:08:00.498 [2024-04-25 23:54:49.929784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.498 [2024-04-25 23:54:49.929810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.498 #62 NEW cov: 11800 ft: 15326 corp: 41/1818b lim: 90 exec/s: 62 rss: 70Mb L: 24/90 MS: 1 ShuffleBytes- 00:08:00.498 [2024-04-25 23:54:49.970313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.498 [2024-04-25 23:54:49.970340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.498 [2024-04-25 23:54:49.970377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.498 [2024-04-25 23:54:49.970399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.498 [2024-04-25 23:54:49.970454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:00.498 [2024-04-25 23:54:49.970471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:00.498 [2024-04-25 23:54:49.970527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:00.498 [2024-04-25 23:54:49.970544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.498 #63 NEW cov: 11800 ft: 15331 corp: 42/1907b lim: 90 exec/s: 63 rss: 70Mb L: 89/90 MS: 1 InsertRepeatedBytes- 00:08:00.498 [2024-04-25 23:54:50.009942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.498 [2024-04-25 23:54:50.009969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.498 #64 NEW cov: 11800 ft: 15335 corp: 43/1937b lim: 90 exec/s: 64 rss: 70Mb L: 30/90 MS: 1 CrossOver- 00:08:00.498 [2024-04-25 23:54:50.050088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.498 [2024-04-25 23:54:50.050115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.498 #65 NEW cov: 11800 ft: 15347 corp: 44/1962b lim: 90 exec/s: 65 rss: 70Mb L: 25/90 MS: 1 ShuffleBytes- 00:08:00.498 [2024-04-25 23:54:50.090312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.498 [2024-04-25 23:54:50.090339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.498 [2024-04-25 23:54:50.090399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:00.498 [2024-04-25 23:54:50.090416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.757 #66 NEW cov: 11800 ft: 15372 corp: 45/2005b lim: 90 exec/s: 66 rss: 70Mb L: 43/90 MS: 1 InsertByte- 00:08:00.757 [2024-04-25 23:54:50.130319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:00.757 [2024-04-25 23:54:50.130348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.757 #67 NEW cov: 11800 ft: 15375 corp: 46/2029b lim: 90 exec/s: 33 rss: 70Mb L: 24/90 MS: 1 ShuffleBytes- 00:08:00.757 #67 DONE cov: 11800 ft: 15375 corp: 46/2029b lim: 90 exec/s: 33 rss: 70Mb 00:08:00.757 ###### Recommended dictionary. ###### 00:08:00.757 "\001\000\000\000\000\000\000?" # Uses: 2 00:08:00.757 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:00.757 ###### End of recommended dictionary. 
###### 00:08:00.757 Done 67 runs in 2 second(s) 00:08:00.757 23:54:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:00.757 23:54:50 -- ../common.sh@72 -- # (( i++ )) 00:08:00.757 23:54:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.757 23:54:50 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:00.757 23:54:50 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:00.757 23:54:50 -- nvmf/run.sh@24 -- # local timen=1 00:08:00.758 23:54:50 -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.758 23:54:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:00.758 23:54:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:00.758 23:54:50 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:00.758 23:54:50 -- nvmf/run.sh@29 -- # port=4421 00:08:00.758 23:54:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:00.758 23:54:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:00.758 23:54:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.758 23:54:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:00.758 [2024-04-25 23:54:50.307409] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:00.758 [2024-04-25 23:54:50.307505] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480896 ] 00:08:00.758 EAL: No free 2048 kB hugepages reported on node 1 00:08:01.017 [2024-04-25 23:54:50.494323] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.017 [2024-04-25 23:54:50.513775] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:01.017 [2024-04-25 23:54:50.513903] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.017 [2024-04-25 23:54:50.565873] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:01.017 [2024-04-25 23:54:50.582188] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:01.017 INFO: Running with entropic power schedule (0xFF, 100). 00:08:01.017 INFO: Seed: 1844832631 00:08:01.017 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:08:01.017 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:08:01.017 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:01.017 INFO: A corpus is not provided, starting from an empty corpus 00:08:01.017 #2 INITED exec/s: 0 rss: 60Mb 00:08:01.017 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:01.017 This may also happen if the target rejected all inputs we tried so far 00:08:01.275 [2024-04-25 23:54:50.649067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:01.275 [2024-04-25 23:54:50.649102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.275 [2024-04-25 23:54:50.649229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.275 [2024-04-25 23:54:50.649254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.275 [2024-04-25 23:54:50.649377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:01.276 [2024-04-25 23:54:50.649407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.276 [2024-04-25 23:54:50.649556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:01.276 [2024-04-25 23:54:50.649581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.535 NEW_FUNC[1/664]: 0x4c2900 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:01.535 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.535 #5 NEW cov: 11539 ft: 11540 corp: 2/42b lim: 50 exec/s: 0 rss: 67Mb L: 41/41 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:01.535 [2024-04-25 23:54:50.989551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:01.535 [2024-04-25 23:54:50.989587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:50.989738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.535 [2024-04-25 23:54:50.989768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:50.989897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:01.535 [2024-04-25 23:54:50.989922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:50.990046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:01.535 [2024-04-25 23:54:50.990069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.535 NEW_FUNC[1/1]: 0x172e630 in nvme_get_transport /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_transport.c:55 00:08:01.535 #6 NEW cov: 11661 ft: 11995 corp: 3/83b lim: 50 exec/s: 0 rss: 67Mb L: 41/41 MS: 1 CMP- DE: "\264)\012\002\000\000\000\000"- 00:08:01.535 [2024-04-25 23:54:51.049788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 
00:08:01.535 [2024-04-25 23:54:51.049823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:51.049949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.535 [2024-04-25 23:54:51.049974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:51.050096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:01.535 [2024-04-25 23:54:51.050119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:51.050239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:01.535 [2024-04-25 23:54:51.050264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.535 #7 NEW cov: 11667 ft: 12157 corp: 4/124b lim: 50 exec/s: 0 rss: 67Mb L: 41/41 MS: 1 ShuffleBytes- 00:08:01.535 [2024-04-25 23:54:51.089645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:01.535 [2024-04-25 23:54:51.089674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:51.089770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.535 [2024-04-25 23:54:51.089792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:51.089920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:01.535 [2024-04-25 23:54:51.089939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.535 #11 NEW cov: 11752 ft: 12851 corp: 5/159b lim: 50 exec/s: 0 rss: 67Mb L: 35/41 MS: 4 PersAutoDict-ChangeBinInt-ChangeByte-InsertRepeatedBytes- DE: "\264)\012\002\000\000\000\000"- 00:08:01.535 [2024-04-25 23:54:51.129578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:01.535 [2024-04-25 23:54:51.129608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:51.129689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.535 [2024-04-25 23:54:51.129711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:51.129840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:01.535 [2024-04-25 23:54:51.129866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.535 [2024-04-25 23:54:51.129983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:01.535 [2024-04-25 
23:54:51.130007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.794 #14 NEW cov: 11752 ft: 13024 corp: 6/204b lim: 50 exec/s: 0 rss: 67Mb L: 45/45 MS: 3 InsertByte-CopyPart-CrossOver- 00:08:01.794 [2024-04-25 23:54:51.169409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:01.794 [2024-04-25 23:54:51.169436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.169527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.794 [2024-04-25 23:54:51.169548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.169663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:01.794 [2024-04-25 23:54:51.169690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.794 #15 NEW cov: 11752 ft: 13167 corp: 7/241b lim: 50 exec/s: 0 rss: 68Mb L: 37/45 MS: 1 EraseBytes- 00:08:01.794 [2024-04-25 23:54:51.210133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:01.794 [2024-04-25 23:54:51.210168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.210257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.794 [2024-04-25 23:54:51.210284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.210419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:01.794 [2024-04-25 23:54:51.210445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.210579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:01.794 [2024-04-25 23:54:51.210601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.794 #16 NEW cov: 11752 ft: 13267 corp: 8/282b lim: 50 exec/s: 0 rss: 68Mb L: 41/45 MS: 1 PersAutoDict- DE: "\264)\012\002\000\000\000\000"- 00:08:01.794 [2024-04-25 23:54:51.250294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:01.794 [2024-04-25 23:54:51.250325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.250426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.794 [2024-04-25 23:54:51.250448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.250564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) 
sqid:1 cid:2 nsid:0 00:08:01.794 [2024-04-25 23:54:51.250583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.250720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:01.794 [2024-04-25 23:54:51.250744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.794 #17 NEW cov: 11752 ft: 13298 corp: 9/323b lim: 50 exec/s: 0 rss: 68Mb L: 41/45 MS: 1 ShuffleBytes- 00:08:01.794 [2024-04-25 23:54:51.290369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:01.794 [2024-04-25 23:54:51.290401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.290492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.794 [2024-04-25 23:54:51.290515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.290644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:01.794 [2024-04-25 23:54:51.290667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.290786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:01.794 [2024-04-25 23:54:51.290811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.794 #18 NEW cov: 11752 ft: 13357 corp: 10/368b lim: 50 exec/s: 0 rss: 68Mb L: 45/45 MS: 1 ShuffleBytes- 00:08:01.794 [2024-04-25 23:54:51.330518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:01.794 [2024-04-25 23:54:51.330559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.330648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.794 [2024-04-25 23:54:51.330669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.330791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:01.794 [2024-04-25 23:54:51.330816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.330931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:01.794 [2024-04-25 23:54:51.330953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.794 #19 NEW cov: 11752 ft: 13405 corp: 11/409b lim: 50 exec/s: 0 rss: 68Mb L: 41/45 MS: 1 ChangeByte- 00:08:01.794 [2024-04-25 23:54:51.370666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 
cid:0 nsid:0 00:08:01.794 [2024-04-25 23:54:51.370697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.370783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:01.794 [2024-04-25 23:54:51.370806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.794 [2024-04-25 23:54:51.370921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:01.795 [2024-04-25 23:54:51.370943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.795 [2024-04-25 23:54:51.371056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:01.795 [2024-04-25 23:54:51.371080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.795 #20 NEW cov: 11752 ft: 13481 corp: 12/454b lim: 50 exec/s: 0 rss: 68Mb L: 45/45 MS: 1 ChangeByte- 00:08:02.053 [2024-04-25 23:54:51.410580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.053 [2024-04-25 23:54:51.410610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.410740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.053 [2024-04-25 23:54:51.410764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.410877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.053 [2024-04-25 23:54:51.410901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.411015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.053 [2024-04-25 23:54:51.411035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.053 #21 NEW cov: 11752 ft: 13512 corp: 13/496b lim: 50 exec/s: 0 rss: 68Mb L: 42/45 MS: 1 CrossOver- 00:08:02.053 [2024-04-25 23:54:51.450480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.053 [2024-04-25 23:54:51.450505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.450630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.053 [2024-04-25 23:54:51.450654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.053 #26 NEW cov: 11752 ft: 13826 corp: 14/520b lim: 50 exec/s: 0 rss: 68Mb L: 24/45 MS: 5 PersAutoDict-ShuffleBytes-EraseBytes-ChangeBinInt-InsertRepeatedBytes- DE: "\264)\012\002\000\000\000\000"- 00:08:02.053 [2024-04-25 23:54:51.491086] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.053 [2024-04-25 23:54:51.491114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.491216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.053 [2024-04-25 23:54:51.491239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.491363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.053 [2024-04-25 23:54:51.491385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.491494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.053 [2024-04-25 23:54:51.491518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.053 #27 NEW cov: 11752 ft: 13832 corp: 15/567b lim: 50 exec/s: 0 rss: 68Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:02.053 [2024-04-25 23:54:51.531224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.053 [2024-04-25 23:54:51.531256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.531382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.053 [2024-04-25 23:54:51.531405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.531518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.053 [2024-04-25 23:54:51.531539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.531661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.053 [2024-04-25 23:54:51.531687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.053 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:02.053 #28 NEW cov: 11775 ft: 13977 corp: 16/608b lim: 50 exec/s: 0 rss: 68Mb L: 41/47 MS: 1 ShuffleBytes- 00:08:02.053 [2024-04-25 23:54:51.570882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.053 [2024-04-25 23:54:51.570911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 23:54:51.571005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.053 [2024-04-25 23:54:51.571024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.053 [2024-04-25 
23:54:51.571146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.054 [2024-04-25 23:54:51.571164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.054 #29 NEW cov: 11775 ft: 13983 corp: 17/645b lim: 50 exec/s: 0 rss: 68Mb L: 37/47 MS: 1 CrossOver- 00:08:02.054 [2024-04-25 23:54:51.611374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.054 [2024-04-25 23:54:51.611411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.054 [2024-04-25 23:54:51.611535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.054 [2024-04-25 23:54:51.611557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.054 [2024-04-25 23:54:51.611678] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.054 [2024-04-25 23:54:51.611700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.054 [2024-04-25 23:54:51.611823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.054 [2024-04-25 23:54:51.611848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.054 #30 NEW cov: 11775 ft: 13998 corp: 18/686b lim: 50 exec/s: 30 rss: 68Mb L: 41/47 MS: 1 ChangeByte- 00:08:02.054 [2024-04-25 23:54:51.651585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.054 [2024-04-25 23:54:51.651612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.054 [2024-04-25 23:54:51.651695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.054 [2024-04-25 23:54:51.651714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.054 [2024-04-25 23:54:51.651833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.054 [2024-04-25 23:54:51.651856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.054 [2024-04-25 23:54:51.651976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.054 [2024-04-25 23:54:51.651998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.312 #31 NEW cov: 11775 ft: 14032 corp: 19/735b lim: 50 exec/s: 31 rss: 68Mb L: 49/49 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:02.312 [2024-04-25 23:54:51.690823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.312 [2024-04-25 23:54:51.690859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:02.312 [2024-04-25 23:54:51.690981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.312 [2024-04-25 23:54:51.691005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.312 #32 NEW cov: 11775 ft: 14062 corp: 20/758b lim: 50 exec/s: 32 rss: 68Mb L: 23/49 MS: 1 EraseBytes- 00:08:02.312 [2024-04-25 23:54:51.741809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.312 [2024-04-25 23:54:51.741839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.312 [2024-04-25 23:54:51.741934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.312 [2024-04-25 23:54:51.741955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.312 [2024-04-25 23:54:51.742075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.312 [2024-04-25 23:54:51.742097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.312 [2024-04-25 23:54:51.742219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.312 [2024-04-25 23:54:51.742240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.312 #33 NEW cov: 11775 ft: 14168 corp: 21/802b lim: 50 exec/s: 33 rss: 68Mb L: 44/49 MS: 1 InsertRepeatedBytes- 00:08:02.312 [2024-04-25 23:54:51.781678] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.312 [2024-04-25 23:54:51.781708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.781846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.313 [2024-04-25 23:54:51.781870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.782001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.313 [2024-04-25 23:54:51.782024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.313 #34 NEW cov: 11775 ft: 14188 corp: 22/833b lim: 50 exec/s: 34 rss: 69Mb L: 31/49 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:02.313 [2024-04-25 23:54:51.822011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.313 [2024-04-25 23:54:51.822042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.822121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.313 [2024-04-25 23:54:51.822143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.822269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.313 [2024-04-25 23:54:51.822293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.822427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.313 [2024-04-25 23:54:51.822449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.313 #35 NEW cov: 11775 ft: 14204 corp: 23/875b lim: 50 exec/s: 35 rss: 69Mb L: 42/49 MS: 1 InsertByte- 00:08:02.313 [2024-04-25 23:54:51.861719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.313 [2024-04-25 23:54:51.861754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.861858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.313 [2024-04-25 23:54:51.861881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.861999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.313 [2024-04-25 23:54:51.862025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.862136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.313 [2024-04-25 23:54:51.862159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.313 #36 NEW cov: 11775 ft: 14263 corp: 24/916b lim: 50 exec/s: 36 rss: 69Mb L: 41/49 MS: 1 ChangeBit- 00:08:02.313 [2024-04-25 23:54:51.901902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.313 [2024-04-25 23:54:51.901935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.902031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.313 [2024-04-25 23:54:51.902051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.902167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.313 [2024-04-25 23:54:51.902191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.313 [2024-04-25 23:54:51.902312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.313 [2024-04-25 23:54:51.902336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.572 #37 NEW cov: 11775 ft: 14268 corp: 25/957b lim: 50 exec/s: 37 
rss: 69Mb L: 41/49 MS: 1 ShuffleBytes- 00:08:02.572 [2024-04-25 23:54:51.951875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.572 [2024-04-25 23:54:51.951907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.572 [2024-04-25 23:54:51.952030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.572 [2024-04-25 23:54:51.952051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.572 #38 NEW cov: 11775 ft: 14284 corp: 26/983b lim: 50 exec/s: 38 rss: 69Mb L: 26/49 MS: 1 EraseBytes- 00:08:02.572 [2024-04-25 23:54:52.012061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.572 [2024-04-25 23:54:52.012097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.572 [2024-04-25 23:54:52.012190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.572 [2024-04-25 23:54:52.012213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.572 #39 NEW cov: 11775 ft: 14354 corp: 27/1009b lim: 50 exec/s: 39 rss: 69Mb L: 26/49 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:02.572 [2024-04-25 23:54:52.072808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.572 [2024-04-25 23:54:52.072840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.572 [2024-04-25 23:54:52.072927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.572 [2024-04-25 23:54:52.072951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.572 [2024-04-25 23:54:52.073073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.572 [2024-04-25 23:54:52.073097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.572 [2024-04-25 23:54:52.073227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.572 [2024-04-25 23:54:52.073249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.572 #40 NEW cov: 11775 ft: 14425 corp: 28/1058b lim: 50 exec/s: 40 rss: 69Mb L: 49/49 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:02.572 [2024-04-25 23:54:52.122975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.572 [2024-04-25 23:54:52.123004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.572 [2024-04-25 23:54:52.123094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.572 [2024-04-25 23:54:52.123116] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.572 [2024-04-25 23:54:52.123242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.572 [2024-04-25 23:54:52.123262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.572 [2024-04-25 23:54:52.123385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.572 [2024-04-25 23:54:52.123409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.572 #41 NEW cov: 11775 ft: 14488 corp: 29/1099b lim: 50 exec/s: 41 rss: 69Mb L: 41/49 MS: 1 ShuffleBytes- 00:08:02.572 [2024-04-25 23:54:52.172879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.572 [2024-04-25 23:54:52.172910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.572 [2024-04-25 23:54:52.173004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.572 [2024-04-25 23:54:52.173025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.572 [2024-04-25 23:54:52.173142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.572 [2024-04-25 23:54:52.173168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.831 #42 NEW cov: 11775 ft: 14505 corp: 30/1130b lim: 50 exec/s: 42 rss: 69Mb L: 31/49 MS: 1 ShuffleBytes- 00:08:02.831 [2024-04-25 23:54:52.213019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.831 [2024-04-25 23:54:52.213047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.831 [2024-04-25 23:54:52.213156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.831 [2024-04-25 23:54:52.213179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.831 [2024-04-25 23:54:52.213308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.831 [2024-04-25 23:54:52.213329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.831 #43 NEW cov: 11775 ft: 14511 corp: 31/1167b lim: 50 exec/s: 43 rss: 69Mb L: 37/49 MS: 1 ChangeBinInt- 00:08:02.831 [2024-04-25 23:54:52.253062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.831 [2024-04-25 23:54:52.253092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.831 [2024-04-25 23:54:52.253204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.831 [2024-04-25 23:54:52.253224] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.831 [2024-04-25 23:54:52.253353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.831 [2024-04-25 23:54:52.253376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.831 #44 NEW cov: 11775 ft: 14527 corp: 32/1199b lim: 50 exec/s: 44 rss: 69Mb L: 32/49 MS: 1 InsertByte- 00:08:02.831 [2024-04-25 23:54:52.293175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.831 [2024-04-25 23:54:52.293206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.831 [2024-04-25 23:54:52.293333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.831 [2024-04-25 23:54:52.293355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.831 [2024-04-25 23:54:52.293473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.831 [2024-04-25 23:54:52.293491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.831 #45 NEW cov: 11775 ft: 14533 corp: 33/1237b lim: 50 exec/s: 45 rss: 69Mb L: 38/49 MS: 1 InsertByte- 00:08:02.831 [2024-04-25 23:54:52.333580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.831 [2024-04-25 23:54:52.333610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.831 [2024-04-25 23:54:52.333693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.832 [2024-04-25 23:54:52.333716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.832 [2024-04-25 23:54:52.333838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.832 [2024-04-25 23:54:52.333873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.832 [2024-04-25 23:54:52.333997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.832 [2024-04-25 23:54:52.334024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.832 #46 NEW cov: 11775 ft: 14548 corp: 34/1278b lim: 50 exec/s: 46 rss: 69Mb L: 41/49 MS: 1 CMP- DE: "\000\000\000\011"- 00:08:02.832 [2024-04-25 23:54:52.383416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.832 [2024-04-25 23:54:52.383446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.832 [2024-04-25 23:54:52.383543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.832 [2024-04-25 23:54:52.383566] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.832 [2024-04-25 23:54:52.383691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.832 [2024-04-25 23:54:52.383716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.832 #47 NEW cov: 11775 ft: 14566 corp: 35/1309b lim: 50 exec/s: 47 rss: 69Mb L: 31/49 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:02.832 [2024-04-25 23:54:52.423644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:02.832 [2024-04-25 23:54:52.423673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.832 [2024-04-25 23:54:52.423777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:02.832 [2024-04-25 23:54:52.423797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.832 [2024-04-25 23:54:52.423924] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:02.832 [2024-04-25 23:54:52.423948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.832 [2024-04-25 23:54:52.424072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:02.832 [2024-04-25 23:54:52.424088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.091 #48 NEW cov: 11775 ft: 14576 corp: 36/1356b lim: 50 exec/s: 48 rss: 69Mb L: 47/49 MS: 1 ShuffleBytes- 00:08:03.091 [2024-04-25 23:54:52.463921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.091 [2024-04-25 23:54:52.463951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.091 [2024-04-25 23:54:52.464036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.091 [2024-04-25 23:54:52.464057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.091 [2024-04-25 23:54:52.464178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.091 [2024-04-25 23:54:52.464203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.091 [2024-04-25 23:54:52.464303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.091 [2024-04-25 23:54:52.464324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.091 #49 NEW cov: 11775 ft: 14602 corp: 37/1397b lim: 50 exec/s: 49 rss: 70Mb L: 41/49 MS: 1 ShuffleBytes- 00:08:03.091 [2024-04-25 23:54:52.514066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.091 [2024-04-25 
23:54:52.514096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.091 [2024-04-25 23:54:52.514218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.091 [2024-04-25 23:54:52.514239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.092 [2024-04-25 23:54:52.514354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.092 [2024-04-25 23:54:52.514378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.092 [2024-04-25 23:54:52.514512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.092 [2024-04-25 23:54:52.514531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.092 #50 NEW cov: 11775 ft: 14611 corp: 38/1443b lim: 50 exec/s: 50 rss: 70Mb L: 46/49 MS: 1 PersAutoDict- DE: "\264)\012\002\000\000\000\000"- 00:08:03.092 [2024-04-25 23:54:52.554260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.092 [2024-04-25 23:54:52.554289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.092 [2024-04-25 23:54:52.554373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.092 [2024-04-25 23:54:52.554399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.092 [2024-04-25 23:54:52.554528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:03.092 [2024-04-25 23:54:52.554545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.092 [2024-04-25 23:54:52.554668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:03.092 [2024-04-25 23:54:52.554693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.092 #51 NEW cov: 11775 ft: 14629 corp: 39/1491b lim: 50 exec/s: 51 rss: 70Mb L: 48/49 MS: 1 InsertRepeatedBytes- 00:08:03.092 [2024-04-25 23:54:52.593771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:03.092 [2024-04-25 23:54:52.593797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.092 [2024-04-25 23:54:52.593921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:03.092 [2024-04-25 23:54:52.593947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.092 #52 NEW cov: 11775 ft: 14635 corp: 40/1516b lim: 50 exec/s: 52 rss: 70Mb L: 25/49 MS: 1 EraseBytes- 00:08:03.092 [2024-04-25 23:54:52.634072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) 
sqid:1 cid:0 nsid:0
00:08:03.092 [2024-04-25 23:54:52.634102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:03.092 [2024-04-25 23:54:52.634205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:03.092 [2024-04-25 23:54:52.634230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:03.092 [2024-04-25 23:54:52.634355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:03.092 [2024-04-25 23:54:52.634376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:03.092 #53 NEW cov: 11775 ft: 14644 corp: 41/1546b lim: 50 exec/s: 26 rss: 70Mb L: 30/49 MS: 1 EraseBytes-
00:08:03.092 #53 DONE cov: 11775 ft: 14644 corp: 41/1546b lim: 50 exec/s: 26 rss: 70Mb
00:08:03.092 ###### Recommended dictionary. ######
00:08:03.092 "\264)\012\002\000\000\000\000" # Uses: 4
00:08:03.092 "\377\377\377\377\377\377\377\377" # Uses: 3
00:08:03.092 "\000\000\000\011" # Uses: 0
00:08:03.092 "\001\000\000\000" # Uses: 0
00:08:03.092 ###### End of recommended dictionary. ######
00:08:03.092 Done 53 runs in 2 second(s)
00:08:03.358 23:54:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf
00:08:03.358 23:54:52 -- ../common.sh@72 -- # (( i++ ))
00:08:03.358 23:54:52 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:03.358 23:54:52 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1
00:08:03.358 23:54:52 -- nvmf/run.sh@23 -- # local fuzzer_type=22
00:08:03.358 23:54:52 -- nvmf/run.sh@24 -- # local timen=1
00:08:03.358 23:54:52 -- nvmf/run.sh@25 -- # local core=0x1
00:08:03.358 23:54:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:03.358 23:54:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf
00:08:03.358 23:54:52 -- nvmf/run.sh@29 -- # printf %02d 22
00:08:03.358 23:54:52 -- nvmf/run.sh@29 -- # port=4422
00:08:03.358 23:54:52 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:03.358 23:54:52 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422'
00:08:03.358 23:54:52 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:03.358 23:54:52 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock
00:08:03.358 [2024-04-25 23:54:52.809632] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:08:03.358 [2024-04-25 23:54:52.809702] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid481393 ]
00:08:03.618 EAL: No free 2048 kB hugepages reported on node 1
00:08:03.618 [2024-04-25 23:54:52.985247] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:03.618 [2024-04-25 23:54:53.004379] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:03.618 [2024-04-25 23:54:53.004523] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:03.618 [2024-04-25 23:54:53.055929] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:03.618 [2024-04-25 23:54:53.072201] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 ***
00:08:03.618 INFO: Running with entropic power schedule (0xFF, 100).
00:08:03.618 INFO: Seed: 39831878
00:08:03.618 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23),
00:08:03.618 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798),
00:08:03.618 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:03.618 INFO: A corpus is not provided, starting from an empty corpus
00:08:03.618 #2 INITED exec/s: 0 rss: 59Mb
00:08:03.618 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:03.618 This may also happen if the target rejected all inputs we tried so far
00:08:03.618 [2024-04-25 23:54:53.117462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:03.618 [2024-04-25 23:54:53.117492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:03.618 [2024-04-25 23:54:53.117555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:03.618 [2024-04-25 23:54:53.117570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:03.618 [2024-04-25 23:54:53.117624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:03.618 [2024-04-25 23:54:53.117639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:03.877 NEW_FUNC[1/664]: 0x4c4bc0 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644
00:08:03.877 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:03.877 #3 NEW cov: 11571 ft: 11572 corp: 2/55b lim: 85 exec/s: 0 rss: 66Mb L: 54/54 MS: 1 InsertRepeatedBytes-
00:08:03.877 [2024-04-25 23:54:53.438059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:03.877 [2024-04-25 23:54:53.438090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:03.877 NEW_FUNC[1/1]: 0x1738e40 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1088
00:08:03.877 #7 NEW
cov: 11687 ft: 12826 corp: 3/74b lim: 85 exec/s: 0 rss: 67Mb L: 19/54 MS: 4 ChangeBit-ChangeBit-CrossOver-CMP- DE: "\377v\025k}\375\013\254"- 00:08:03.877 [2024-04-25 23:54:53.478043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:03.877 [2024-04-25 23:54:53.478069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.135 #10 NEW cov: 11693 ft: 12989 corp: 4/92b lim: 85 exec/s: 0 rss: 67Mb L: 18/54 MS: 3 PersAutoDict-CrossOver-PersAutoDict- DE: "\377v\025k}\375\013\254"-"\377v\025k}\375\013\254"- 00:08:04.135 [2024-04-25 23:54:53.518409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.135 [2024-04-25 23:54:53.518435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.135 [2024-04-25 23:54:53.518488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.135 [2024-04-25 23:54:53.518511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.135 [2024-04-25 23:54:53.518561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.135 [2024-04-25 23:54:53.518576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.135 #11 NEW cov: 11778 ft: 13369 corp: 5/147b lim: 85 exec/s: 0 rss: 67Mb L: 55/55 MS: 1 InsertByte- 00:08:04.135 [2024-04-25 23:54:53.558260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.135 [2024-04-25 23:54:53.558286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.135 #12 NEW cov: 11778 ft: 13464 corp: 6/165b lim: 85 exec/s: 0 rss: 67Mb L: 18/55 MS: 1 PersAutoDict- DE: "\377v\025k}\375\013\254"- 00:08:04.135 [2024-04-25 23:54:53.598663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.135 [2024-04-25 23:54:53.598689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.135 [2024-04-25 23:54:53.598747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.135 [2024-04-25 23:54:53.598762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.135 [2024-04-25 23:54:53.598814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.135 [2024-04-25 23:54:53.598828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.135 #13 NEW cov: 11778 ft: 13537 corp: 7/227b lim: 85 exec/s: 0 rss: 67Mb L: 62/62 MS: 1 PersAutoDict- DE: "\377v\025k}\375\013\254"- 00:08:04.135 [2024-04-25 23:54:53.638503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.135 [2024-04-25 23:54:53.638528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.135 #14 NEW cov: 11778 ft: 13617 corp: 8/245b lim: 85 exec/s: 0 rss: 67Mb L: 18/62 MS: 1 ChangeBit- 00:08:04.135 [2024-04-25 23:54:53.679030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.135 [2024-04-25 23:54:53.679056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.135 [2024-04-25 23:54:53.679096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.135 [2024-04-25 23:54:53.679110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.136 [2024-04-25 23:54:53.679158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.136 [2024-04-25 23:54:53.679172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.136 [2024-04-25 23:54:53.679222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:04.136 [2024-04-25 23:54:53.679237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.136 #15 NEW cov: 11778 ft: 14025 corp: 9/314b lim: 85 exec/s: 0 rss: 67Mb L: 69/69 MS: 1 InsertRepeatedBytes- 00:08:04.136 [2024-04-25 23:54:53.719063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.136 [2024-04-25 23:54:53.719090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.136 [2024-04-25 23:54:53.719140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.136 [2024-04-25 23:54:53.719161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.136 [2024-04-25 23:54:53.719211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.136 [2024-04-25 23:54:53.719227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.136 #16 NEW cov: 11778 ft: 14084 corp: 10/369b lim: 85 exec/s: 0 rss: 68Mb L: 55/69 MS: 1 InsertByte- 00:08:04.394 [2024-04-25 23:54:53.749115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.395 [2024-04-25 23:54:53.749142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.749186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.395 [2024-04-25 23:54:53.749201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.749252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.395 [2024-04-25 23:54:53.749267] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.395 #17 NEW cov: 11778 ft: 14128 corp: 11/425b lim: 85 exec/s: 0 rss: 68Mb L: 56/69 MS: 1 InsertByte- 00:08:04.395 [2024-04-25 23:54:53.789224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.395 [2024-04-25 23:54:53.789249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.789286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.395 [2024-04-25 23:54:53.789301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.789355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.395 [2024-04-25 23:54:53.789370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.395 #18 NEW cov: 11778 ft: 14156 corp: 12/479b lim: 85 exec/s: 0 rss: 69Mb L: 54/69 MS: 1 ChangeBinInt- 00:08:04.395 [2024-04-25 23:54:53.819459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.395 [2024-04-25 23:54:53.819488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.819541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.395 [2024-04-25 23:54:53.819556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.819615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.395 [2024-04-25 23:54:53.819630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.819681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:04.395 [2024-04-25 23:54:53.819695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.395 #19 NEW cov: 11778 ft: 14216 corp: 13/560b lim: 85 exec/s: 0 rss: 69Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:08:04.395 [2024-04-25 23:54:53.859507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.395 [2024-04-25 23:54:53.859533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.859569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.395 [2024-04-25 23:54:53.859583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.859634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.395 [2024-04-25 23:54:53.859649] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.395 #20 NEW cov: 11778 ft: 14239 corp: 14/619b lim: 85 exec/s: 0 rss: 69Mb L: 59/81 MS: 1 CopyPart- 00:08:04.395 [2024-04-25 23:54:53.899571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.395 [2024-04-25 23:54:53.899597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.899629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.395 [2024-04-25 23:54:53.899644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.899700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.395 [2024-04-25 23:54:53.899715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.395 #21 NEW cov: 11778 ft: 14247 corp: 15/676b lim: 85 exec/s: 0 rss: 69Mb L: 57/81 MS: 1 InsertRepeatedBytes- 00:08:04.395 [2024-04-25 23:54:53.929776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.395 [2024-04-25 23:54:53.929801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.929862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.395 [2024-04-25 23:54:53.929877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.929927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.395 [2024-04-25 23:54:53.929942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.929994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:04.395 [2024-04-25 23:54:53.930009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.395 #22 NEW cov: 11778 ft: 14299 corp: 16/745b lim: 85 exec/s: 0 rss: 69Mb L: 69/81 MS: 1 ShuffleBytes- 00:08:04.395 [2024-04-25 23:54:53.969638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.395 [2024-04-25 23:54:53.969664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.395 [2024-04-25 23:54:53.969699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.395 [2024-04-25 23:54:53.969714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.395 #23 NEW cov: 11778 ft: 14595 corp: 17/793b lim: 85 exec/s: 0 rss: 69Mb L: 48/81 MS: 1 EraseBytes- 00:08:04.654 [2024-04-25 23:54:54.009898] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.654 [2024-04-25 23:54:54.009923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.654 [2024-04-25 23:54:54.009959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.654 [2024-04-25 23:54:54.009974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.654 [2024-04-25 23:54:54.010025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.654 [2024-04-25 23:54:54.010040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.654 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:04.654 #24 NEW cov: 11801 ft: 14636 corp: 18/856b lim: 85 exec/s: 0 rss: 69Mb L: 63/81 MS: 1 PersAutoDict- DE: "\377v\025k}\375\013\254"- 00:08:04.654 [2024-04-25 23:54:54.050147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.654 [2024-04-25 23:54:54.050173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.654 [2024-04-25 23:54:54.050219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.654 [2024-04-25 23:54:54.050240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.654 [2024-04-25 23:54:54.050288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.654 [2024-04-25 23:54:54.050301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.654 [2024-04-25 23:54:54.050350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:04.654 [2024-04-25 23:54:54.050365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.654 #25 NEW cov: 11801 ft: 14656 corp: 19/938b lim: 85 exec/s: 0 rss: 69Mb L: 82/82 MS: 1 InsertByte- 00:08:04.654 [2024-04-25 23:54:54.090250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.654 [2024-04-25 23:54:54.090275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.654 [2024-04-25 23:54:54.090321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.654 [2024-04-25 23:54:54.090343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.654 [2024-04-25 23:54:54.090398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.654 [2024-04-25 23:54:54.090413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:04.654 [2024-04-25 23:54:54.090462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:04.654 [2024-04-25 23:54:54.090476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.654 #26 NEW cov: 11801 ft: 14667 corp: 20/1022b lim: 85 exec/s: 26 rss: 69Mb L: 84/84 MS: 1 CopyPart- 00:08:04.654 [2024-04-25 23:54:54.130223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.654 [2024-04-25 23:54:54.130247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.654 [2024-04-25 23:54:54.130283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.655 [2024-04-25 23:54:54.130298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.655 [2024-04-25 23:54:54.130354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.655 [2024-04-25 23:54:54.130369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.655 #27 NEW cov: 11801 ft: 14719 corp: 21/1085b lim: 85 exec/s: 27 rss: 69Mb L: 63/84 MS: 1 CrossOver- 00:08:04.655 [2024-04-25 23:54:54.170489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.655 [2024-04-25 23:54:54.170515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.655 [2024-04-25 23:54:54.170561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.655 [2024-04-25 23:54:54.170581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.655 [2024-04-25 23:54:54.170631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.655 [2024-04-25 23:54:54.170645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.655 [2024-04-25 23:54:54.170694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:04.655 [2024-04-25 23:54:54.170708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.655 #28 NEW cov: 11801 ft: 14756 corp: 22/1153b lim: 85 exec/s: 28 rss: 70Mb L: 68/84 MS: 1 InsertRepeatedBytes- 00:08:04.655 [2024-04-25 23:54:54.210467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.655 [2024-04-25 23:54:54.210493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.655 [2024-04-25 23:54:54.210536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.655 [2024-04-25 23:54:54.210558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:08:04.655 [2024-04-25 23:54:54.210608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.655 [2024-04-25 23:54:54.210622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.655 #29 NEW cov: 11801 ft: 14766 corp: 23/1208b lim: 85 exec/s: 29 rss: 70Mb L: 55/84 MS: 1 ChangeBinInt- 00:08:04.655 [2024-04-25 23:54:54.240530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.655 [2024-04-25 23:54:54.240557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.655 [2024-04-25 23:54:54.240591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.655 [2024-04-25 23:54:54.240606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.655 [2024-04-25 23:54:54.240660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.655 [2024-04-25 23:54:54.240674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.913 #30 NEW cov: 11801 ft: 14815 corp: 24/1262b lim: 85 exec/s: 30 rss: 70Mb L: 54/84 MS: 1 ChangeBinInt- 00:08:04.913 [2024-04-25 23:54:54.280409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.913 [2024-04-25 23:54:54.280433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.913 #31 NEW cov: 11801 ft: 14832 corp: 25/1280b lim: 85 exec/s: 31 rss: 70Mb L: 18/84 MS: 1 ChangeBit- 00:08:04.913 [2024-04-25 23:54:54.320754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.913 [2024-04-25 23:54:54.320779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.913 [2024-04-25 23:54:54.320820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.913 [2024-04-25 23:54:54.320835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.913 [2024-04-25 23:54:54.320886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.913 [2024-04-25 23:54:54.320900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.913 #32 NEW cov: 11801 ft: 14842 corp: 26/1335b lim: 85 exec/s: 32 rss: 70Mb L: 55/84 MS: 1 ChangeASCIIInt- 00:08:04.913 [2024-04-25 23:54:54.360616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.913 [2024-04-25 23:54:54.360641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.913 #33 NEW cov: 11801 ft: 14845 corp: 27/1361b lim: 85 exec/s: 33 rss: 70Mb L: 26/84 MS: 1 CrossOver- 00:08:04.913 [2024-04-25 23:54:54.400963] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.913 [2024-04-25 23:54:54.400987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.913 [2024-04-25 23:54:54.401039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.913 [2024-04-25 23:54:54.401058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.913 [2024-04-25 23:54:54.401111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.913 [2024-04-25 23:54:54.401125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.913 #34 NEW cov: 11801 ft: 14858 corp: 28/1416b lim: 85 exec/s: 34 rss: 70Mb L: 55/84 MS: 1 CMP- DE: "\377\002"- 00:08:04.913 [2024-04-25 23:54:54.431078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.913 [2024-04-25 23:54:54.431103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.913 [2024-04-25 23:54:54.431146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.913 [2024-04-25 23:54:54.431171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.913 [2024-04-25 23:54:54.431222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.913 [2024-04-25 23:54:54.431236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.913 #35 NEW cov: 11801 ft: 14872 corp: 29/1480b lim: 85 exec/s: 35 rss: 70Mb L: 64/84 MS: 1 PersAutoDict- DE: "\377\002"- 00:08:04.913 [2024-04-25 23:54:54.471082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.913 [2024-04-25 23:54:54.471107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.913 [2024-04-25 23:54:54.471143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.913 [2024-04-25 23:54:54.471156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.913 #36 NEW cov: 11801 ft: 14902 corp: 30/1514b lim: 85 exec/s: 36 rss: 70Mb L: 34/84 MS: 1 CMP- DE: "H\000\000\000\000\000\000\000"- 00:08:04.913 [2024-04-25 23:54:54.511327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:04.913 [2024-04-25 23:54:54.511354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.913 [2024-04-25 23:54:54.511390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:04.913 [2024-04-25 23:54:54.511411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:04.913 [2024-04-25 23:54:54.511462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:04.913 [2024-04-25 23:54:54.511477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.171 #37 NEW cov: 11801 ft: 14933 corp: 31/1569b lim: 85 exec/s: 37 rss: 70Mb L: 55/84 MS: 1 ChangeASCIIInt- 00:08:05.171 [2024-04-25 23:54:54.541116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.171 [2024-04-25 23:54:54.541142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.171 #38 NEW cov: 11801 ft: 15037 corp: 32/1589b lim: 85 exec/s: 38 rss: 70Mb L: 20/84 MS: 1 InsertByte- 00:08:05.171 [2024-04-25 23:54:54.581667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.171 [2024-04-25 23:54:54.581693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.171 [2024-04-25 23:54:54.581738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.171 [2024-04-25 23:54:54.581761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.171 [2024-04-25 23:54:54.581811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:05.171 [2024-04-25 23:54:54.581826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.171 [2024-04-25 23:54:54.581877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:05.171 [2024-04-25 23:54:54.581893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.171 #39 NEW cov: 11801 ft: 15053 corp: 33/1670b lim: 85 exec/s: 39 rss: 70Mb L: 81/84 MS: 1 CopyPart- 00:08:05.171 [2024-04-25 23:54:54.621650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.171 [2024-04-25 23:54:54.621678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.171 [2024-04-25 23:54:54.621716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.172 [2024-04-25 23:54:54.621731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.172 [2024-04-25 23:54:54.621782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:05.172 [2024-04-25 23:54:54.621796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.172 #40 NEW cov: 11801 ft: 15057 corp: 34/1733b lim: 85 exec/s: 40 rss: 70Mb L: 63/84 MS: 1 PersAutoDict- DE: "\377v\025k}\375\013\254"- 00:08:05.172 [2024-04-25 23:54:54.661812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.172 
[2024-04-25 23:54:54.661838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.172 [2024-04-25 23:54:54.661885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.172 [2024-04-25 23:54:54.661906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.172 [2024-04-25 23:54:54.661955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:05.172 [2024-04-25 23:54:54.661970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.172 #41 NEW cov: 11801 ft: 15095 corp: 35/1795b lim: 85 exec/s: 41 rss: 70Mb L: 62/84 MS: 1 ChangeByte- 00:08:05.172 [2024-04-25 23:54:54.701830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.172 [2024-04-25 23:54:54.701857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.172 [2024-04-25 23:54:54.701900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.172 [2024-04-25 23:54:54.701923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.172 [2024-04-25 23:54:54.701974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:05.172 [2024-04-25 23:54:54.701988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.172 #42 NEW cov: 11801 ft: 15107 corp: 36/1852b lim: 85 exec/s: 42 rss: 70Mb L: 57/84 MS: 1 ChangeBinInt- 00:08:05.172 [2024-04-25 23:54:54.742127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.172 [2024-04-25 23:54:54.742152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.172 [2024-04-25 23:54:54.742198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.172 [2024-04-25 23:54:54.742221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.172 [2024-04-25 23:54:54.742270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:05.172 [2024-04-25 23:54:54.742284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.172 [2024-04-25 23:54:54.742334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:05.172 [2024-04-25 23:54:54.742353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.172 #43 NEW cov: 11801 ft: 15123 corp: 37/1933b lim: 85 exec/s: 43 rss: 70Mb L: 81/84 MS: 1 CopyPart- 00:08:05.172 [2024-04-25 23:54:54.781976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 
00:08:05.172 [2024-04-25 23:54:54.782004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.172 [2024-04-25 23:54:54.782058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.172 [2024-04-25 23:54:54.782074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.431 #44 NEW cov: 11801 ft: 15130 corp: 38/1981b lim: 85 exec/s: 44 rss: 70Mb L: 48/84 MS: 1 CopyPart- 00:08:05.431 [2024-04-25 23:54:54.822352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.431 [2024-04-25 23:54:54.822378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.431 [2024-04-25 23:54:54.822445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.431 [2024-04-25 23:54:54.822462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.431 [2024-04-25 23:54:54.822512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:05.431 [2024-04-25 23:54:54.822528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.431 [2024-04-25 23:54:54.822578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:05.431 [2024-04-25 23:54:54.822595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.431 #45 NEW cov: 11801 ft: 15144 corp: 39/2065b lim: 85 exec/s: 45 rss: 70Mb L: 84/84 MS: 1 ChangeBit- 00:08:05.431 [2024-04-25 23:54:54.862059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.431 [2024-04-25 23:54:54.862086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.431 #46 NEW cov: 11801 ft: 15181 corp: 40/2088b lim: 85 exec/s: 46 rss: 70Mb L: 23/84 MS: 1 CMP- DE: "\001\000\000\177"- 00:08:05.431 [2024-04-25 23:54:54.902165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.431 [2024-04-25 23:54:54.902190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.431 #47 NEW cov: 11801 ft: 15223 corp: 41/2111b lim: 85 exec/s: 47 rss: 70Mb L: 23/84 MS: 1 ShuffleBytes- 00:08:05.431 [2024-04-25 23:54:54.942437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.431 [2024-04-25 23:54:54.942462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.431 [2024-04-25 23:54:54.942496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.431 [2024-04-25 23:54:54.942512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:05.431 #48 NEW cov: 11801 ft: 15224 corp: 42/2145b lim: 85 exec/s: 48 rss: 70Mb L: 34/84 MS: 1 ChangeBit- 00:08:05.431 [2024-04-25 23:54:54.982437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.431 [2024-04-25 23:54:54.982463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.431 #49 NEW cov: 11801 ft: 15230 corp: 43/2163b lim: 85 exec/s: 49 rss: 70Mb L: 18/84 MS: 1 CrossOver- 00:08:05.431 [2024-04-25 23:54:55.022816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.431 [2024-04-25 23:54:55.022845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.431 [2024-04-25 23:54:55.022896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.431 [2024-04-25 23:54:55.022919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.431 [2024-04-25 23:54:55.022972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:05.431 [2024-04-25 23:54:55.022985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.690 #50 NEW cov: 11801 ft: 15302 corp: 44/2226b lim: 85 exec/s: 50 rss: 71Mb L: 63/84 MS: 1 PersAutoDict- DE: "\001\000\000\177"- 00:08:05.690 [2024-04-25 23:54:55.062897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.690 [2024-04-25 23:54:55.062923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.690 [2024-04-25 23:54:55.062958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.690 [2024-04-25 23:54:55.062973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.690 [2024-04-25 23:54:55.063028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:05.690 [2024-04-25 23:54:55.063043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.690 #51 NEW cov: 11801 ft: 15305 corp: 45/2289b lim: 85 exec/s: 51 rss: 71Mb L: 63/84 MS: 1 PersAutoDict- DE: "\377v\025k}\375\013\254"- 00:08:05.690 [2024-04-25 23:54:55.102913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:05.690 [2024-04-25 23:54:55.102939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.690 [2024-04-25 23:54:55.102990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:05.690 [2024-04-25 23:54:55.103014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.690 #52 NEW cov: 11801 ft: 15326 corp: 46/2337b lim: 85 exec/s: 26 rss: 71Mb L: 48/84 MS: 1 ShuffleBytes- 00:08:05.690 #52 DONE cov: 11801 
ft: 15326 corp: 46/2337b lim: 85 exec/s: 26 rss: 71Mb
00:08:05.690 ###### Recommended dictionary. ######
00:08:05.690 "\377v\025k}\375\013\254" # Uses: 7
00:08:05.690 "\377\002" # Uses: 1
00:08:05.690 "H\000\000\000\000\000\000\000" # Uses: 0
00:08:05.690 "\001\000\000\177" # Uses: 1
00:08:05.690 ###### End of recommended dictionary. ######
00:08:05.690 Done 52 runs in 2 second(s)
00:08:05.690 23:54:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf
00:08:05.690 23:54:55 -- ../common.sh@72 -- # (( i++ ))
00:08:05.690 23:54:55 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:05.690 23:54:55 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1
00:08:05.690 23:54:55 -- nvmf/run.sh@23 -- # local fuzzer_type=23
00:08:05.690 23:54:55 -- nvmf/run.sh@24 -- # local timen=1
00:08:05.690 23:54:55 -- nvmf/run.sh@25 -- # local core=0x1
00:08:05.690 23:54:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:05.690 23:54:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf
00:08:05.690 23:54:55 -- nvmf/run.sh@29 -- # printf %02d 23
00:08:05.690 23:54:55 -- nvmf/run.sh@29 -- # port=4423
00:08:05.690 23:54:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:05.690 23:54:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423'
00:08:05.690 23:54:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:05.690 23:54:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock
00:08:05.949 [2024-04-25 23:54:55.286551] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization...
00:08:05.949 [2024-04-25 23:54:55.286645] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid481930 ]
00:08:05.949 EAL: No free 2048 kB hugepages reported on node 1
00:08:05.949 [2024-04-25 23:54:55.460617] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:05.949 [2024-04-25 23:54:55.479413] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:05.949 [2024-04-25 23:54:55.479534] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:05.949 [2024-04-25 23:54:55.530874] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:05.949 [2024-04-25 23:54:55.547172] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 ***
00:08:06.207 INFO: Running with entropic power schedule (0xFF, 100).
00:08:06.207 INFO: Seed: 2513852612
00:08:06.207 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23),
00:08:06.207 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798),
00:08:06.207 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:06.207 INFO: A corpus is not provided, starting from an empty corpus
00:08:06.207 #2 INITED exec/s: 0 rss: 59Mb
00:08:06.207 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:06.207 This may also happen if the target rejected all inputs we tried so far
00:08:06.207 [2024-04-25 23:54:55.592160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:06.207 [2024-04-25 23:54:55.592191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:06.466 NEW_FUNC[1/664]: 0x4c7df0 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671
00:08:06.466 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:06.466 #5 NEW cov: 11507 ft: 11508 corp: 2/6b lim: 25 exec/s: 0 rss: 67Mb L: 5/5 MS: 3 CrossOver-CMP-InsertByte- DE: "\000\001"-
00:08:06.466 [2024-04-25 23:54:55.913514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:06.466 [2024-04-25 23:54:55.913555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:06.466 [2024-04-25 23:54:55.913620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:06.466 [2024-04-25 23:54:55.913641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:06.466 [2024-04-25 23:54:55.913702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:06.466 [2024-04-25 23:54:55.913723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:06.466 [2024-04-25 23:54:55.913783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:06.466 [2024-04-25 23:54:55.913803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:06.466 [2024-04-25 23:54:55.913865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0
00:08:06.466 [2024-04-25 23:54:55.913886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:06.466 #6 NEW cov: 11620 ft: 12650 corp: 3/31b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 InsertRepeatedBytes-
00:08:06.466 [2024-04-25 23:54:55.963089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:06.466 [2024-04-25 23:54:55.963117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:06.466 #7 NEW cov: 11626 ft: 12913 corp: 4/40b lim: 25 exec/s: 0 rss: 67Mb L: 9/25 MS:
1 CopyPart- 00:08:06.466 [2024-04-25 23:54:56.003215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.466 [2024-04-25 23:54:56.003241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.466 #8 NEW cov: 11711 ft: 13148 corp: 5/45b lim: 25 exec/s: 0 rss: 67Mb L: 5/25 MS: 1 ChangeBit- 00:08:06.466 [2024-04-25 23:54:56.043323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.466 [2024-04-25 23:54:56.043349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.466 #9 NEW cov: 11711 ft: 13212 corp: 6/51b lim: 25 exec/s: 0 rss: 67Mb L: 6/25 MS: 1 InsertByte- 00:08:06.466 [2024-04-25 23:54:56.073359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.466 [2024-04-25 23:54:56.073386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.725 #10 NEW cov: 11711 ft: 13258 corp: 7/56b lim: 25 exec/s: 0 rss: 67Mb L: 5/25 MS: 1 PersAutoDict- DE: "\000\001"- 00:08:06.725 [2024-04-25 23:54:56.113506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.725 [2024-04-25 23:54:56.113534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.725 #11 NEW cov: 11711 ft: 13327 corp: 8/62b lim: 25 exec/s: 0 rss: 68Mb L: 6/25 MS: 1 CrossOver- 00:08:06.725 [2024-04-25 23:54:56.153605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.725 [2024-04-25 23:54:56.153632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.725 #12 NEW cov: 11711 ft: 13430 corp: 9/68b lim: 25 exec/s: 0 rss: 68Mb L: 6/25 MS: 1 InsertByte- 00:08:06.725 [2024-04-25 23:54:56.183814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.725 [2024-04-25 23:54:56.183841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.725 #13 NEW cov: 11711 ft: 13453 corp: 10/73b lim: 25 exec/s: 0 rss: 68Mb L: 5/25 MS: 1 ChangeBit- 00:08:06.725 [2024-04-25 23:54:56.223859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.725 [2024-04-25 23:54:56.223884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.725 #15 NEW cov: 11711 ft: 13510 corp: 11/79b lim: 25 exec/s: 0 rss: 68Mb L: 6/25 MS: 2 EraseBytes-PersAutoDict- DE: "\000\001"- 00:08:06.725 [2024-04-25 23:54:56.253920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.725 [2024-04-25 23:54:56.253947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.725 #16 NEW cov: 11711 ft: 13569 corp: 12/88b lim: 25 exec/s: 0 rss: 68Mb L: 9/25 MS: 1 CrossOver- 00:08:06.725 [2024-04-25 23:54:56.294089] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.725 [2024-04-25 23:54:56.294115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.725 #17 NEW cov: 11711 ft: 13630 corp: 13/94b lim: 25 exec/s: 0 rss: 68Mb L: 6/25 MS: 1 InsertByte- 00:08:06.725 [2024-04-25 23:54:56.324116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.725 [2024-04-25 23:54:56.324143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.984 #18 NEW cov: 11711 ft: 13655 corp: 14/102b lim: 25 exec/s: 0 rss: 68Mb L: 8/25 MS: 1 PersAutoDict- DE: "\000\001"- 00:08:06.984 [2024-04-25 23:54:56.364472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.984 [2024-04-25 23:54:56.364500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.984 [2024-04-25 23:54:56.364538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:06.984 [2024-04-25 23:54:56.364554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.984 [2024-04-25 23:54:56.364608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:06.984 [2024-04-25 23:54:56.364622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.984 #19 NEW cov: 11711 ft: 14000 corp: 15/119b lim: 25 exec/s: 0 rss: 68Mb L: 17/25 MS: 1 CMP- DE: "\263)\012\002\000\000\000\000"- 00:08:06.984 [2024-04-25 23:54:56.404665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.984 [2024-04-25 23:54:56.404692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.984 [2024-04-25 23:54:56.404753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:06.984 [2024-04-25 23:54:56.404769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.984 [2024-04-25 23:54:56.404818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:06.984 [2024-04-25 23:54:56.404831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.984 [2024-04-25 23:54:56.404886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:06.984 [2024-04-25 23:54:56.404903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.984 #20 NEW cov: 11711 ft: 14029 corp: 16/141b lim: 25 exec/s: 0 rss: 68Mb L: 22/25 MS: 1 InsertRepeatedBytes- 00:08:06.984 [2024-04-25 23:54:56.444478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.984 [2024-04-25 23:54:56.444504] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.984 #21 NEW cov: 11711 ft: 14045 corp: 17/148b lim: 25 exec/s: 0 rss: 68Mb L: 7/25 MS: 1 CopyPart- 00:08:06.984 [2024-04-25 23:54:56.474580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.984 [2024-04-25 23:54:56.474606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.984 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.984 #22 NEW cov: 11734 ft: 14131 corp: 18/154b lim: 25 exec/s: 0 rss: 68Mb L: 6/25 MS: 1 ChangeBinInt- 00:08:06.984 [2024-04-25 23:54:56.514662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.984 [2024-04-25 23:54:56.514688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.984 #23 NEW cov: 11734 ft: 14183 corp: 19/160b lim: 25 exec/s: 0 rss: 69Mb L: 6/25 MS: 1 CopyPart- 00:08:06.984 [2024-04-25 23:54:56.555125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.984 [2024-04-25 23:54:56.555155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.984 [2024-04-25 23:54:56.555192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:06.984 [2024-04-25 23:54:56.555207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.984 [2024-04-25 23:54:56.555259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:06.984 [2024-04-25 23:54:56.555275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.984 [2024-04-25 23:54:56.555329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:06.984 [2024-04-25 23:54:56.555344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.984 #24 NEW cov: 11734 ft: 14231 corp: 20/181b lim: 25 exec/s: 0 rss: 69Mb L: 21/25 MS: 1 InsertRepeatedBytes- 00:08:06.984 [2024-04-25 23:54:56.594887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:06.984 [2024-04-25 23:54:56.594914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.243 #25 NEW cov: 11734 ft: 14293 corp: 21/189b lim: 25 exec/s: 25 rss: 69Mb L: 8/25 MS: 1 EraseBytes- 00:08:07.243 [2024-04-25 23:54:56.635013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.243 [2024-04-25 23:54:56.635039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.243 #26 NEW cov: 11734 ft: 14304 corp: 22/194b lim: 25 exec/s: 26 rss: 69Mb L: 5/25 MS: 1 EraseBytes- 00:08:07.243 [2024-04-25 23:54:56.675353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.243 [2024-04-25 23:54:56.675381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.243 [2024-04-25 23:54:56.675422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:07.243 [2024-04-25 23:54:56.675437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.243 [2024-04-25 23:54:56.675490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:07.243 [2024-04-25 23:54:56.675505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.243 #27 NEW cov: 11734 ft: 14310 corp: 23/213b lim: 25 exec/s: 27 rss: 69Mb L: 19/25 MS: 1 InsertRepeatedBytes- 00:08:07.243 [2024-04-25 23:54:56.715274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.243 [2024-04-25 23:54:56.715303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.243 #28 NEW cov: 11734 ft: 14386 corp: 24/221b lim: 25 exec/s: 28 rss: 69Mb L: 8/25 MS: 1 ShuffleBytes- 00:08:07.243 [2024-04-25 23:54:56.755354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.243 [2024-04-25 23:54:56.755380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.243 #29 NEW cov: 11734 ft: 14393 corp: 25/228b lim: 25 exec/s: 29 rss: 69Mb L: 7/25 MS: 1 InsertByte- 00:08:07.243 [2024-04-25 23:54:56.795422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.243 [2024-04-25 23:54:56.795448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.243 #30 NEW cov: 11734 ft: 14401 corp: 26/235b lim: 25 exec/s: 30 rss: 69Mb L: 7/25 MS: 1 CopyPart- 00:08:07.243 [2024-04-25 23:54:56.835599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.243 [2024-04-25 23:54:56.835627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.243 #31 NEW cov: 11734 ft: 14411 corp: 27/241b lim: 25 exec/s: 31 rss: 69Mb L: 6/25 MS: 1 InsertByte- 00:08:07.502 [2024-04-25 23:54:56.865640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.502 [2024-04-25 23:54:56.865667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.502 #32 NEW cov: 11734 ft: 14485 corp: 28/246b lim: 25 exec/s: 32 rss: 69Mb L: 5/25 MS: 1 ShuffleBytes- 00:08:07.502 [2024-04-25 23:54:56.895742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.502 [2024-04-25 23:54:56.895767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.502 #33 NEW cov: 11734 ft: 14513 corp: 29/253b lim: 25 exec/s: 33 rss: 69Mb L: 7/25 
MS: 1 ChangeByte- 00:08:07.502 [2024-04-25 23:54:56.936227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.502 [2024-04-25 23:54:56.936253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.502 [2024-04-25 23:54:56.936302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:07.502 [2024-04-25 23:54:56.936318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.502 [2024-04-25 23:54:56.936370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:07.502 [2024-04-25 23:54:56.936386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.502 [2024-04-25 23:54:56.936460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:07.502 [2024-04-25 23:54:56.936477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.502 #34 NEW cov: 11734 ft: 14560 corp: 30/277b lim: 25 exec/s: 34 rss: 69Mb L: 24/25 MS: 1 CopyPart- 00:08:07.502 [2024-04-25 23:54:56.976086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.502 [2024-04-25 23:54:56.976113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.502 [2024-04-25 23:54:56.976151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:07.502 [2024-04-25 23:54:56.976167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.502 #39 NEW cov: 11734 ft: 14769 corp: 31/287b lim: 25 exec/s: 39 rss: 69Mb L: 10/25 MS: 5 EraseBytes-EraseBytes-CopyPart-CopyPart-PersAutoDict- DE: "\263)\012\002\000\000\000\000"- 00:08:07.502 [2024-04-25 23:54:57.016482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.502 [2024-04-25 23:54:57.016508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.502 [2024-04-25 23:54:57.016554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:07.502 [2024-04-25 23:54:57.016569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.502 [2024-04-25 23:54:57.016624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:07.502 [2024-04-25 23:54:57.016640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.502 [2024-04-25 23:54:57.016698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:07.502 [2024-04-25 23:54:57.016714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.502 #40 NEW cov: 11734 ft: 
14780 corp: 32/308b lim: 25 exec/s: 40 rss: 69Mb L: 21/25 MS: 1 ChangeBinInt- 00:08:07.502 [2024-04-25 23:54:57.056210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.502 [2024-04-25 23:54:57.056237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.502 #41 NEW cov: 11734 ft: 14795 corp: 33/314b lim: 25 exec/s: 41 rss: 69Mb L: 6/25 MS: 1 ChangeBit- 00:08:07.502 [2024-04-25 23:54:57.086315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.502 [2024-04-25 23:54:57.086341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.502 #42 NEW cov: 11734 ft: 14803 corp: 34/323b lim: 25 exec/s: 42 rss: 69Mb L: 9/25 MS: 1 CrossOver- 00:08:07.761 [2024-04-25 23:54:57.126451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.761 [2024-04-25 23:54:57.126478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.761 #43 NEW cov: 11734 ft: 14811 corp: 35/329b lim: 25 exec/s: 43 rss: 69Mb L: 6/25 MS: 1 ChangeByte- 00:08:07.761 [2024-04-25 23:54:57.166763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.761 [2024-04-25 23:54:57.166790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.761 [2024-04-25 23:54:57.166829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:07.761 [2024-04-25 23:54:57.166846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.761 [2024-04-25 23:54:57.166901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:07.761 [2024-04-25 23:54:57.166918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.761 #44 NEW cov: 11734 ft: 14826 corp: 36/347b lim: 25 exec/s: 44 rss: 69Mb L: 18/25 MS: 1 CopyPart- 00:08:07.761 [2024-04-25 23:54:57.206688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.761 [2024-04-25 23:54:57.206714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.761 #45 NEW cov: 11734 ft: 14836 corp: 37/352b lim: 25 exec/s: 45 rss: 69Mb L: 5/25 MS: 1 ShuffleBytes- 00:08:07.761 [2024-04-25 23:54:57.236729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.761 [2024-04-25 23:54:57.236756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.761 #46 NEW cov: 11734 ft: 14890 corp: 38/359b lim: 25 exec/s: 46 rss: 69Mb L: 7/25 MS: 1 InsertByte- 00:08:07.761 [2024-04-25 23:54:57.267159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.761 [2024-04-25 23:54:57.267186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.761 [2024-04-25 23:54:57.267236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:07.761 [2024-04-25 23:54:57.267253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.761 [2024-04-25 23:54:57.267308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:07.761 [2024-04-25 23:54:57.267328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.761 [2024-04-25 23:54:57.267384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:07.761 [2024-04-25 23:54:57.267405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.761 #47 NEW cov: 11734 ft: 14896 corp: 39/381b lim: 25 exec/s: 47 rss: 70Mb L: 22/25 MS: 1 EraseBytes- 00:08:07.761 [2024-04-25 23:54:57.306942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.761 [2024-04-25 23:54:57.306968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.761 #48 NEW cov: 11734 ft: 14907 corp: 40/386b lim: 25 exec/s: 48 rss: 70Mb L: 5/25 MS: 1 ChangeBinInt- 00:08:07.761 [2024-04-25 23:54:57.347507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:07.761 [2024-04-25 23:54:57.347534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.761 [2024-04-25 23:54:57.347577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:07.761 [2024-04-25 23:54:57.347594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.761 [2024-04-25 23:54:57.347648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:07.761 [2024-04-25 23:54:57.347664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.761 [2024-04-25 23:54:57.347718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:07.761 [2024-04-25 23:54:57.347734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.761 [2024-04-25 23:54:57.347789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:07.761 [2024-04-25 23:54:57.347805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:08.020 #49 NEW cov: 11734 ft: 14943 corp: 41/411b lim: 25 exec/s: 49 rss: 70Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:08.021 [2024-04-25 23:54:57.387151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.021 [2024-04-25 23:54:57.387177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.021 #50 NEW cov: 11734 ft: 14995 corp: 42/419b lim: 25 exec/s: 50 rss: 70Mb L: 8/25 MS: 1 PersAutoDict- DE: "\000\001"- 00:08:08.021 [2024-04-25 23:54:57.427254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.021 [2024-04-25 23:54:57.427281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.021 #51 NEW cov: 11734 ft: 15004 corp: 43/426b lim: 25 exec/s: 51 rss: 70Mb L: 7/25 MS: 1 InsertByte- 00:08:08.021 [2024-04-25 23:54:57.467530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.021 [2024-04-25 23:54:57.467556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.021 [2024-04-25 23:54:57.467594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:08.021 [2024-04-25 23:54:57.467610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.021 #52 NEW cov: 11734 ft: 15036 corp: 44/437b lim: 25 exec/s: 52 rss: 70Mb L: 11/25 MS: 1 CopyPart- 00:08:08.021 [2024-04-25 23:54:57.507527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.021 [2024-04-25 23:54:57.507553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.021 #55 NEW cov: 11734 ft: 15072 corp: 45/443b lim: 25 exec/s: 55 rss: 70Mb L: 6/25 MS: 3 EraseBytes-CopyPart-CopyPart- 00:08:08.021 [2024-04-25 23:54:57.547610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.021 [2024-04-25 23:54:57.547636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.021 #56 NEW cov: 11734 ft: 15108 corp: 46/449b lim: 25 exec/s: 56 rss: 70Mb L: 6/25 MS: 1 InsertByte- 00:08:08.021 [2024-04-25 23:54:57.587758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:08.021 [2024-04-25 23:54:57.587784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.021 #57 NEW cov: 11734 ft: 15123 corp: 47/457b lim: 25 exec/s: 28 rss: 70Mb L: 8/25 MS: 1 ChangeBit- 00:08:08.021 #57 DONE cov: 11734 ft: 15123 corp: 47/457b lim: 25 exec/s: 28 rss: 70Mb 00:08:08.021 ###### Recommended dictionary. ###### 00:08:08.021 "\000\001" # Uses: 4 00:08:08.021 "\263)\012\002\000\000\000\000" # Uses: 1 00:08:08.021 ###### End of recommended dictionary. 
###### 00:08:08.021 Done 57 runs in 2 second(s) 00:08:08.280 23:54:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:08.280 23:54:57 -- ../common.sh@72 -- # (( i++ )) 00:08:08.280 23:54:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.280 23:54:57 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:08.280 23:54:57 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:08.280 23:54:57 -- nvmf/run.sh@24 -- # local timen=1 00:08:08.280 23:54:57 -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.280 23:54:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:08.280 23:54:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:08.280 23:54:57 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:08.280 23:54:57 -- nvmf/run.sh@29 -- # port=4424 00:08:08.280 23:54:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:08.280 23:54:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:08.280 23:54:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.280 23:54:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:08.280 [2024-04-25 23:54:57.766657] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:08.280 [2024-04-25 23:54:57.766728] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid482227 ] 00:08:08.280 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.540 [2024-04-25 23:54:57.942361] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.540 [2024-04-25 23:54:57.961320] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:08.540 [2024-04-25 23:54:57.961468] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.540 [2024-04-25 23:54:58.012935] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.540 [2024-04-25 23:54:58.029226] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:08.540 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.540 INFO: Seed: 701887610 00:08:08.540 INFO: Loaded 1 modules (341207 inline 8-bit counters): 341207 [0x26a854c, 0x26fba23), 00:08:08.540 INFO: Loaded 1 PC tables (341207 PCs): 341207 [0x26fba28,0x2c30798), 00:08:08.540 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:08.540 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.540 #2 INITED exec/s: 0 rss: 59Mb 00:08:08.540 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:08.540 This may also happen if the target rejected all inputs we tried so far 00:08:08.540 [2024-04-25 23:54:58.095195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920094709246184 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.540 [2024-04-25 23:54:58.095238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.540 [2024-04-25 23:54:58.095358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.540 [2024-04-25 23:54:58.095381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.798 NEW_FUNC[1/664]: 0x4c8ed0 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:08.799 NEW_FUNC[2/664]: 0x4d9b30 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.799 #8 NEW cov: 11572 ft: 11580 corp: 2/58b lim: 100 exec/s: 0 rss: 67Mb L: 57/57 MS: 1 InsertRepeatedBytes- 00:08:09.058 [2024-04-25 23:54:58.436248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.436297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.058 [2024-04-25 23:54:58.436430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.436460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.058 NEW_FUNC[1/1]: 0x12a9230 in nvmf_transport_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:723 00:08:09.058 #11 NEW cov: 11692 ft: 12299 corp: 3/99b lim: 100 exec/s: 0 rss: 67Mb L: 41/57 MS: 3 ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:09.058 [2024-04-25 23:54:58.475996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.476030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.058 #12 NEW cov: 11698 ft: 13358 corp: 4/126b lim: 100 exec/s: 0 rss: 67Mb L: 27/57 MS: 1 EraseBytes- 00:08:09.058 [2024-04-25 23:54:58.526402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.526434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.058 [2024-04-25 23:54:58.526525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.526549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.058 #13 NEW cov: 11783 ft: 13595 corp: 5/169b lim: 100 exec/s: 0 rss: 
67Mb L: 43/57 MS: 1 CMP- DE: "\000\000"- 00:08:09.058 [2024-04-25 23:54:58.566527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.566553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.058 [2024-04-25 23:54:58.566632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:10752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.566653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.058 #14 NEW cov: 11783 ft: 13669 corp: 6/213b lim: 100 exec/s: 0 rss: 67Mb L: 44/57 MS: 1 InsertByte- 00:08:09.058 [2024-04-25 23:54:58.606597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920094709246952 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.606626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.058 [2024-04-25 23:54:58.606761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.606786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.058 #15 NEW cov: 11783 ft: 13778 corp: 7/270b lim: 100 exec/s: 0 rss: 67Mb L: 57/57 MS: 1 ChangeBinInt- 00:08:09.058 [2024-04-25 23:54:58.646761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.646791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.058 [2024-04-25 23:54:58.646899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.058 [2024-04-25 23:54:58.646918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.058 #16 NEW cov: 11783 ft: 13874 corp: 8/311b lim: 100 exec/s: 0 rss: 68Mb L: 41/57 MS: 1 CrossOver- 00:08:09.318 [2024-04-25 23:54:58.686584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.686610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.318 #17 NEW cov: 11783 ft: 13929 corp: 9/338b lim: 100 exec/s: 0 rss: 68Mb L: 27/57 MS: 1 ChangeBit- 00:08:09.318 [2024-04-25 23:54:58.736868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.736897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.318 #18 NEW cov: 11783 ft: 14002 corp: 10/365b lim: 100 exec/s: 0 rss: 68Mb L: 27/57 MS: 1 ChangeBinInt- 00:08:09.318 [2024-04-25 23:54:58.777185] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.777210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.318 [2024-04-25 23:54:58.777340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.777363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.318 #19 NEW cov: 11783 ft: 14045 corp: 11/406b lim: 100 exec/s: 0 rss: 68Mb L: 41/57 MS: 1 ChangeBinInt- 00:08:09.318 [2024-04-25 23:54:58.817294] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920094709246184 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.817328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.318 [2024-04-25 23:54:58.817446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920095380334824 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.817472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.318 #20 NEW cov: 11783 ft: 14119 corp: 12/464b lim: 100 exec/s: 0 rss: 68Mb L: 58/58 MS: 1 InsertByte- 00:08:09.318 [2024-04-25 23:54:58.857418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.857445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.318 [2024-04-25 23:54:58.857558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.857580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.318 #21 NEW cov: 11783 ft: 14165 corp: 13/507b lim: 100 exec/s: 0 rss: 68Mb L: 43/58 MS: 1 ChangeByte- 00:08:09.318 [2024-04-25 23:54:58.897761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.897790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.318 [2024-04-25 23:54:58.897895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.897917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.318 [2024-04-25 23:54:58.898031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3907518464 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.318 [2024-04-25 23:54:58.898054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:09.318 #22 NEW cov: 11783 ft: 14525 corp: 14/579b lim: 100 exec/s: 0 rss: 68Mb L: 72/72 MS: 1 CopyPart- 00:08:09.577 [2024-04-25 23:54:58.937708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070404243455 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.577 [2024-04-25 23:54:58.937739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.577 [2024-04-25 23:54:58.937842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:10752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.577 [2024-04-25 23:54:58.937863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.577 #23 NEW cov: 11783 ft: 14547 corp: 15/623b lim: 100 exec/s: 0 rss: 68Mb L: 44/72 MS: 1 ChangeBinInt- 00:08:09.577 [2024-04-25 23:54:58.977749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.577 [2024-04-25 23:54:58.977781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.577 [2024-04-25 23:54:58.977877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:32010391267840 len:7454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.577 [2024-04-25 23:54:58.977898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.577 NEW_FUNC[1/1]: 0x19788f0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.577 #24 NEW cov: 11806 ft: 14587 corp: 16/675b lim: 100 exec/s: 0 rss: 68Mb L: 52/72 MS: 1 InsertRepeatedBytes- 00:08:09.578 [2024-04-25 23:54:59.017698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.578 [2024-04-25 23:54:59.017727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.578 #25 NEW cov: 11806 ft: 14604 corp: 17/697b lim: 100 exec/s: 0 rss: 68Mb L: 22/72 MS: 1 EraseBytes- 00:08:09.578 [2024-04-25 23:54:59.057777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.578 [2024-04-25 23:54:59.057803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.578 #26 NEW cov: 11806 ft: 14625 corp: 18/724b lim: 100 exec/s: 26 rss: 68Mb L: 27/72 MS: 1 ChangeBit- 00:08:09.578 [2024-04-25 23:54:59.097836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.578 [2024-04-25 23:54:59.097868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.578 #27 NEW cov: 11806 ft: 14637 corp: 19/751b lim: 100 exec/s: 27 rss: 68Mb L: 27/72 MS: 1 ShuffleBytes- 00:08:09.578 [2024-04-25 23:54:59.138249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920094709246952 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:09.578 [2024-04-25 23:54:59.138275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.578 [2024-04-25 23:54:59.138403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.578 [2024-04-25 23:54:59.138436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.578 #28 NEW cov: 11806 ft: 14656 corp: 20/810b lim: 100 exec/s: 28 rss: 68Mb L: 59/72 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:09.578 [2024-04-25 23:54:59.178140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.578 [2024-04-25 23:54:59.178166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.837 #29 NEW cov: 11806 ft: 14664 corp: 21/848b lim: 100 exec/s: 29 rss: 68Mb L: 38/72 MS: 1 CrossOver- 00:08:09.837 [2024-04-25 23:54:59.218435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073701097471 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.218466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.837 [2024-04-25 23:54:59.218589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.218610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.837 #33 NEW cov: 11806 ft: 14724 corp: 22/907b lim: 100 exec/s: 33 rss: 68Mb L: 59/72 MS: 4 ShuffleBytes-InsertByte-InsertByte-InsertRepeatedBytes- 00:08:09.837 [2024-04-25 23:54:59.258585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073701097471 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.258612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.837 [2024-04-25 23:54:59.258740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.258762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.837 #34 NEW cov: 11806 ft: 14740 corp: 23/966b lim: 100 exec/s: 34 rss: 68Mb L: 59/72 MS: 1 ChangeBit- 00:08:09.837 [2024-04-25 23:54:59.298656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920094709246184 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.298690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.837 [2024-04-25 23:54:59.298800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920095380334824 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.298822] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.837 #35 NEW cov: 11806 ft: 14761 corp: 24/1024b lim: 100 exec/s: 35 rss: 69Mb L: 58/72 MS: 1 ChangeByte- 00:08:09.837 [2024-04-25 23:54:59.338518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.338548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.837 #36 NEW cov: 11806 ft: 14765 corp: 25/1051b lim: 100 exec/s: 36 rss: 69Mb L: 27/72 MS: 1 ShuffleBytes- 00:08:09.837 [2024-04-25 23:54:59.378954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920094709246184 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.378987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.837 [2024-04-25 23:54:59.379104] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920094526210280 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.379128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.837 #37 NEW cov: 11806 ft: 14782 corp: 26/1109b lim: 100 exec/s: 37 rss: 69Mb L: 58/72 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:09.837 [2024-04-25 23:54:59.419071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920094709246184 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.419105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.837 [2024-04-25 23:54:59.419231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098433788136 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.837 [2024-04-25 23:54:59.419253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.837 #38 NEW cov: 11806 ft: 14786 corp: 27/1166b lim: 100 exec/s: 38 rss: 69Mb L: 57/72 MS: 1 CopyPart- 00:08:10.096 [2024-04-25 23:54:59.459159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078534 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.459191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.096 [2024-04-25 23:54:59.459323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:10752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.459345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.096 #39 NEW cov: 11806 ft: 14855 corp: 28/1210b lim: 100 exec/s: 39 rss: 69Mb L: 44/72 MS: 1 ChangeBinInt- 00:08:10.096 [2024-04-25 23:54:59.499107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18086456104492990464 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.499132] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.096 #40 NEW cov: 11806 ft: 14873 corp: 29/1237b lim: 100 exec/s: 40 rss: 69Mb L: 27/72 MS: 1 ChangeBinInt- 00:08:10.096 [2024-04-25 23:54:59.549419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16782920094709246184 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.549452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.096 [2024-04-25 23:54:59.549581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16782920098422778088 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.549606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.096 #41 NEW cov: 11806 ft: 14917 corp: 30/1295b lim: 100 exec/s: 41 rss: 69Mb L: 58/72 MS: 1 InsertByte- 00:08:10.096 [2024-04-25 23:54:59.599390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.599425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.096 #42 NEW cov: 11806 ft: 15003 corp: 31/1322b lim: 100 exec/s: 42 rss: 69Mb L: 27/72 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:10.096 [2024-04-25 23:54:59.639808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:9985 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.639837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.096 [2024-04-25 23:54:59.639934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:42 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.639958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.096 #43 NEW cov: 11806 ft: 15029 corp: 32/1367b lim: 100 exec/s: 43 rss: 69Mb L: 45/72 MS: 1 InsertByte- 00:08:10.096 [2024-04-25 23:54:59.680165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073701097471 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.680195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.096 [2024-04-25 23:54:59.680297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.680321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.096 [2024-04-25 23:54:59.680448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.096 [2024-04-25 23:54:59.680472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:10.096 #44 NEW cov: 11806 ft: 15066 corp: 33/1428b lim: 100 exec/s: 44 rss: 69Mb L: 61/72 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:10.355 [2024-04-25 23:54:59.719834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070907756543 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.355 [2024-04-25 23:54:59.719865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.355 #47 NEW cov: 11806 ft: 15078 corp: 34/1460b lim: 100 exec/s: 47 rss: 69Mb L: 32/72 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:08:10.355 [2024-04-25 23:54:59.760123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:47803 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.355 [2024-04-25 23:54:59.760154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.355 [2024-04-25 23:54:59.760268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13455272147882261178 len:47617 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.355 [2024-04-25 23:54:59.760306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.355 #48 NEW cov: 11806 ft: 15120 corp: 35/1508b lim: 100 exec/s: 48 rss: 69Mb L: 48/72 MS: 1 InsertRepeatedBytes- 00:08:10.355 [2024-04-25 23:54:59.800367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.355 [2024-04-25 23:54:59.800393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.355 [2024-04-25 23:54:59.800520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:10752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.355 [2024-04-25 23:54:59.800542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.355 #49 NEW cov: 11806 ft: 15122 corp: 36/1552b lim: 100 exec/s: 49 rss: 69Mb L: 44/72 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:10.355 [2024-04-25 23:54:59.850454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6293595036912670551 len:22360 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.355 [2024-04-25 23:54:59.850482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.355 [2024-04-25 23:54:59.850597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:1465334272 len:33 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.355 [2024-04-25 23:54:59.850617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.355 #50 NEW cov: 11806 ft: 15132 corp: 37/1601b lim: 100 exec/s: 50 rss: 69Mb L: 49/72 MS: 1 InsertRepeatedBytes- 00:08:10.355 [2024-04-25 23:54:59.900346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.355 [2024-04-25 23:54:59.900373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.355 #51 NEW cov: 11806 ft: 15173 corp: 38/1623b lim: 100 exec/s: 51 rss: 69Mb L: 22/72 MS: 1 ChangeBit- 00:08:10.355 [2024-04-25 23:54:59.940645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446181120450822143 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.355 [2024-04-25 23:54:59.940677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.355 [2024-04-25 23:54:59.940806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:10752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.355 [2024-04-25 23:54:59.940828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.614 #52 NEW cov: 11806 ft: 15181 corp: 39/1667b lim: 100 exec/s: 52 rss: 70Mb L: 44/72 MS: 1 ChangeBinInt- 00:08:10.614 [2024-04-25 23:54:59.980578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:39243 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.614 [2024-04-25 23:54:59.980604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.614 #53 NEW cov: 11806 ft: 15184 corp: 40/1694b lim: 100 exec/s: 53 rss: 70Mb L: 27/72 MS: 1 CMP- DE: "\024\223\231Jo\025w\000"- 00:08:10.614 [2024-04-25 23:55:00.020625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18392956962804334591 len:59625 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.614 [2024-04-25 23:55:00.020655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.614 #54 NEW cov: 11806 ft: 15198 corp: 41/1726b lim: 100 exec/s: 54 rss: 70Mb L: 32/72 MS: 1 CrossOver- 00:08:10.614 [2024-04-25 23:55:00.070896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:973078528 len:39243 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.614 [2024-04-25 23:55:00.070929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.614 #55 NEW cov: 11806 ft: 15212 corp: 42/1753b lim: 100 exec/s: 27 rss: 70Mb L: 27/72 MS: 1 CrossOver- 00:08:10.614 #55 DONE cov: 11806 ft: 15212 corp: 42/1753b lim: 100 exec/s: 27 rss: 70Mb 00:08:10.614 ###### Recommended dictionary. ###### 00:08:10.614 "\000\000" # Uses: 3 00:08:10.614 "\000\000\000\000" # Uses: 1 00:08:10.614 "\024\223\231Jo\025w\000" # Uses: 0 00:08:10.614 ###### End of recommended dictionary. 
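[editor's note] The "Recommended dictionary" block above is libFuzzer's persistent auto-dictionary: byte sequences (printed with C-style octal escapes) that proved useful during the run, with a use count per entry. Such entries can be fed back into a later run as an AFL-style dictionary. A minimal sketch, assuming a plain libFuzzer target; the binary name, file name, and corpus path here are hypothetical, and the SPDK wrapper's own argument plumbing is not shown in this log:

  # Persist the recommended entries; \024\223\231Jo\025w\000 (octal) == \x14\x93\x99\x4a\x6f\x15\x77\x00 (hex)
  cat > nvmf_24.dict <<'EOF'
  kw1="\x00\x00"
  kw2="\x00\x00\x00\x00"
  kw3="\x14\x93\x99\x4a\x6f\x15\x77\x00"
  EOF
  # Replay them on the next run; -dict= is standard libFuzzer dictionary syntax.
  ./my_fuzzer -dict=nvmf_24.dict corpus/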
###### 00:08:10.614 Done 55 runs in 2 second(s) 00:08:10.614 23:55:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:10.614 23:55:00 -- ../common.sh@72 -- # (( i++ )) 00:08:10.614 23:55:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.614 23:55:00 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:10.614 00:08:10.614 real 1m2.928s 00:08:10.614 user 1m39.202s 00:08:10.614 sys 0m7.380s 00:08:10.614 23:55:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.614 23:55:00 -- common/autotest_common.sh@10 -- # set +x 00:08:10.614 ************************************ 00:08:10.614 END TEST nvmf_fuzz 00:08:10.614 ************************************ 00:08:10.875 23:55:00 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:10.875 23:55:00 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:10.875 23:55:00 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:10.875 23:55:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:10.875 23:55:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:10.875 23:55:00 -- common/autotest_common.sh@10 -- # set +x 00:08:10.875 ************************************ 00:08:10.875 START TEST vfio_fuzz 00:08:10.875 ************************************ 00:08:10.875 23:55:00 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:10.875 * Looking for test storage... 00:08:10.875 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:10.875 23:55:00 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:10.875 23:55:00 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:10.875 23:55:00 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:10.875 23:55:00 -- common/autotest_common.sh@34 -- # set -e 00:08:10.875 23:55:00 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:10.875 23:55:00 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:10.875 23:55:00 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:10.875 23:55:00 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:10.875 23:55:00 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:10.875 23:55:00 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:10.875 23:55:00 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:10.875 23:55:00 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:10.875 23:55:00 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:10.875 23:55:00 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:10.875 23:55:00 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:10.875 23:55:00 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:10.875 23:55:00 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:10.875 23:55:00 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:10.875 23:55:00 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:10.875 23:55:00 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:10.875 23:55:00 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:10.875 23:55:00 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:10.875 23:55:00 -- common/build_config.sh@15 -- # 
CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:10.875 23:55:00 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:10.875 23:55:00 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:10.875 23:55:00 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:10.875 23:55:00 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:10.875 23:55:00 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:10.875 23:55:00 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:10.875 23:55:00 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:10.875 23:55:00 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:10.875 23:55:00 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:10.875 23:55:00 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:10.875 23:55:00 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:10.875 23:55:00 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:10.875 23:55:00 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:10.875 23:55:00 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:10.875 23:55:00 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:10.875 23:55:00 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:10.875 23:55:00 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:10.875 23:55:00 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:10.875 23:55:00 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:10.875 23:55:00 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:10.875 23:55:00 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:10.875 23:55:00 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:10.875 23:55:00 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:10.875 23:55:00 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:10.875 23:55:00 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:10.875 23:55:00 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:10.875 23:55:00 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:10.875 23:55:00 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:10.875 23:55:00 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:10.875 23:55:00 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:10.875 23:55:00 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:10.875 23:55:00 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:10.875 23:55:00 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:10.875 23:55:00 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:10.875 23:55:00 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:10.875 23:55:00 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:10.875 23:55:00 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:10.875 23:55:00 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:10.876 23:55:00 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:10.876 23:55:00 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:10.876 23:55:00 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:10.876 23:55:00 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:10.876 23:55:00 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:10.876 23:55:00 -- 
common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:10.876 23:55:00 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=n 00:08:10.876 23:55:00 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:10.876 23:55:00 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:10.876 23:55:00 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:10.876 23:55:00 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:10.876 23:55:00 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:10.876 23:55:00 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:10.876 23:55:00 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:10.876 23:55:00 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:10.876 23:55:00 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:10.876 23:55:00 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:10.876 23:55:00 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:10.876 23:55:00 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:10.876 23:55:00 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:10.876 23:55:00 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:10.876 23:55:00 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:10.876 23:55:00 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:10.876 23:55:00 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:10.876 23:55:00 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:10.876 23:55:00 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:10.876 23:55:00 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:10.876 23:55:00 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:10.876 23:55:00 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:10.876 23:55:00 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:10.876 23:55:00 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:10.876 23:55:00 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:10.876 23:55:00 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:10.876 23:55:00 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:10.876 23:55:00 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:10.876 23:55:00 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:10.876 23:55:00 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:10.876 23:55:00 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:10.876 23:55:00 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:10.876 23:55:00 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:10.876 23:55:00 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:10.876 23:55:00 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:10.876 #define SPDK_CONFIG_H 00:08:10.876 #define SPDK_CONFIG_APPS 1 00:08:10.876 #define SPDK_CONFIG_ARCH native 00:08:10.876 
#undef SPDK_CONFIG_ASAN 00:08:10.876 #undef SPDK_CONFIG_AVAHI 00:08:10.876 #undef SPDK_CONFIG_CET 00:08:10.876 #define SPDK_CONFIG_COVERAGE 1 00:08:10.876 #define SPDK_CONFIG_CROSS_PREFIX 00:08:10.876 #undef SPDK_CONFIG_CRYPTO 00:08:10.876 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:10.876 #undef SPDK_CONFIG_CUSTOMOCF 00:08:10.876 #undef SPDK_CONFIG_DAOS 00:08:10.876 #define SPDK_CONFIG_DAOS_DIR 00:08:10.876 #define SPDK_CONFIG_DEBUG 1 00:08:10.876 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:10.876 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:10.876 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:10.876 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:10.876 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:10.876 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:10.876 #define SPDK_CONFIG_EXAMPLES 1 00:08:10.876 #undef SPDK_CONFIG_FC 00:08:10.876 #define SPDK_CONFIG_FC_PATH 00:08:10.876 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:10.876 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:10.876 #undef SPDK_CONFIG_FUSE 00:08:10.876 #define SPDK_CONFIG_FUZZER 1 00:08:10.876 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:10.876 #undef SPDK_CONFIG_GOLANG 00:08:10.876 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:10.876 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:10.876 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:10.876 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:10.876 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:10.876 #define SPDK_CONFIG_IDXD 1 00:08:10.876 #undef SPDK_CONFIG_IDXD_KERNEL 00:08:10.876 #undef SPDK_CONFIG_IPSEC_MB 00:08:10.876 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:10.876 #define SPDK_CONFIG_ISAL 1 00:08:10.876 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:10.876 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:10.876 #define SPDK_CONFIG_LIBDIR 00:08:10.876 #undef SPDK_CONFIG_LTO 00:08:10.876 #define SPDK_CONFIG_MAX_LCORES 00:08:10.876 #define SPDK_CONFIG_NVME_CUSE 1 00:08:10.876 #undef SPDK_CONFIG_OCF 00:08:10.876 #define SPDK_CONFIG_OCF_PATH 00:08:10.876 #define SPDK_CONFIG_OPENSSL_PATH 00:08:10.876 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:10.876 #undef SPDK_CONFIG_PGO_USE 00:08:10.876 #define SPDK_CONFIG_PREFIX /usr/local 00:08:10.876 #undef SPDK_CONFIG_RAID5F 00:08:10.876 #undef SPDK_CONFIG_RBD 00:08:10.876 #define SPDK_CONFIG_RDMA 1 00:08:10.876 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:10.876 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:10.876 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:10.876 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:10.876 #undef SPDK_CONFIG_SHARED 00:08:10.876 #undef SPDK_CONFIG_SMA 00:08:10.876 #define SPDK_CONFIG_TESTS 1 00:08:10.876 #undef SPDK_CONFIG_TSAN 00:08:10.876 #define SPDK_CONFIG_UBLK 1 00:08:10.876 #define SPDK_CONFIG_UBSAN 1 00:08:10.876 #undef SPDK_CONFIG_UNIT_TESTS 00:08:10.876 #undef SPDK_CONFIG_URING 00:08:10.876 #define SPDK_CONFIG_URING_PATH 00:08:10.876 #undef SPDK_CONFIG_URING_ZNS 00:08:10.876 #undef SPDK_CONFIG_USDT 00:08:10.876 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:10.876 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:10.876 #define SPDK_CONFIG_VFIO_USER 1 00:08:10.876 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:10.876 #define SPDK_CONFIG_VHOST 1 00:08:10.876 #define SPDK_CONFIG_VIRTIO 1 00:08:10.876 #undef SPDK_CONFIG_VTUNE 00:08:10.876 #define SPDK_CONFIG_VTUNE_DIR 00:08:10.876 #define SPDK_CONFIG_WERROR 1 
00:08:10.876 #define SPDK_CONFIG_WPDK_DIR 00:08:10.876 #undef SPDK_CONFIG_XNVME 00:08:10.876 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:10.876 23:55:00 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:10.876 23:55:00 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:10.876 23:55:00 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:10.876 23:55:00 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:10.876 23:55:00 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:10.876 23:55:00 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:10.876 23:55:00 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:10.876 23:55:00 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:10.876 23:55:00 -- paths/export.sh@5 -- # export PATH 00:08:10.876 23:55:00 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:10.876 23:55:00 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:10.876 23:55:00 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:10.876 23:55:00 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:10.876 23:55:00 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:10.876 23:55:00 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:10.876 23:55:00 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:10.876 23:55:00 -- 
pm/common@16 -- # TEST_TAG=N/A 00:08:10.876 23:55:00 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:10.876 23:55:00 -- common/autotest_common.sh@52 -- # : 1 00:08:10.876 23:55:00 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:10.876 23:55:00 -- common/autotest_common.sh@56 -- # : 0 00:08:10.876 23:55:00 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:10.876 23:55:00 -- common/autotest_common.sh@58 -- # : 0 00:08:10.876 23:55:00 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:10.877 23:55:00 -- common/autotest_common.sh@60 -- # : 1 00:08:10.877 23:55:00 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:10.877 23:55:00 -- common/autotest_common.sh@62 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:10.877 23:55:00 -- common/autotest_common.sh@64 -- # : 00:08:10.877 23:55:00 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:10.877 23:55:00 -- common/autotest_common.sh@66 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:10.877 23:55:00 -- common/autotest_common.sh@68 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:10.877 23:55:00 -- common/autotest_common.sh@70 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:10.877 23:55:00 -- common/autotest_common.sh@72 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:10.877 23:55:00 -- common/autotest_common.sh@74 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:10.877 23:55:00 -- common/autotest_common.sh@76 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:10.877 23:55:00 -- common/autotest_common.sh@78 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:10.877 23:55:00 -- common/autotest_common.sh@80 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:10.877 23:55:00 -- common/autotest_common.sh@82 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:10.877 23:55:00 -- common/autotest_common.sh@84 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:10.877 23:55:00 -- common/autotest_common.sh@86 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:10.877 23:55:00 -- common/autotest_common.sh@88 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:10.877 23:55:00 -- common/autotest_common.sh@90 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:10.877 23:55:00 -- common/autotest_common.sh@92 -- # : 1 00:08:10.877 23:55:00 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:10.877 23:55:00 -- common/autotest_common.sh@94 -- # : 1 00:08:10.877 23:55:00 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:10.877 23:55:00 -- common/autotest_common.sh@96 -- # : rdma 00:08:10.877 23:55:00 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:10.877 23:55:00 -- common/autotest_common.sh@98 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@99 -- # 
export SPDK_TEST_RBD 00:08:10.877 23:55:00 -- common/autotest_common.sh@100 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:10.877 23:55:00 -- common/autotest_common.sh@102 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:10.877 23:55:00 -- common/autotest_common.sh@104 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:10.877 23:55:00 -- common/autotest_common.sh@106 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:10.877 23:55:00 -- common/autotest_common.sh@108 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:10.877 23:55:00 -- common/autotest_common.sh@110 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:10.877 23:55:00 -- common/autotest_common.sh@112 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:10.877 23:55:00 -- common/autotest_common.sh@114 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:10.877 23:55:00 -- common/autotest_common.sh@116 -- # : 1 00:08:10.877 23:55:00 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:10.877 23:55:00 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:10.877 23:55:00 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:10.877 23:55:00 -- common/autotest_common.sh@120 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:10.877 23:55:00 -- common/autotest_common.sh@122 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:10.877 23:55:00 -- common/autotest_common.sh@124 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:10.877 23:55:00 -- common/autotest_common.sh@126 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:10.877 23:55:00 -- common/autotest_common.sh@128 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:10.877 23:55:00 -- common/autotest_common.sh@130 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:10.877 23:55:00 -- common/autotest_common.sh@132 -- # : v23.11 00:08:10.877 23:55:00 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:10.877 23:55:00 -- common/autotest_common.sh@134 -- # : true 00:08:10.877 23:55:00 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:10.877 23:55:00 -- common/autotest_common.sh@136 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:10.877 23:55:00 -- common/autotest_common.sh@138 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:10.877 23:55:00 -- common/autotest_common.sh@140 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:10.877 23:55:00 -- common/autotest_common.sh@142 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:10.877 23:55:00 -- common/autotest_common.sh@144 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:10.877 23:55:00 -- common/autotest_common.sh@146 -- 
# : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:10.877 23:55:00 -- common/autotest_common.sh@148 -- # : 00:08:10.877 23:55:00 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:10.877 23:55:00 -- common/autotest_common.sh@150 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:10.877 23:55:00 -- common/autotest_common.sh@152 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:10.877 23:55:00 -- common/autotest_common.sh@154 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:10.877 23:55:00 -- common/autotest_common.sh@156 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:10.877 23:55:00 -- common/autotest_common.sh@158 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:10.877 23:55:00 -- common/autotest_common.sh@160 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:10.877 23:55:00 -- common/autotest_common.sh@163 -- # : 00:08:10.877 23:55:00 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:10.877 23:55:00 -- common/autotest_common.sh@165 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:10.877 23:55:00 -- common/autotest_common.sh@167 -- # : 0 00:08:10.877 23:55:00 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:10.877 23:55:00 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:10.877 23:55:00 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:10.877 23:55:00 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:10.877 23:55:00 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:10.877 23:55:00 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:10.877 23:55:00 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:10.877 23:55:00 -- common/autotest_common.sh@174 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:10.877 23:55:00 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:10.877 23:55:00 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:10.877 23:55:00 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:10.877 23:55:00 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:10.877 23:55:00 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:10.878 23:55:00 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:10.878 23:55:00 -- common/autotest_common.sh@185 -- 
# PYTHONDONTWRITEBYTECODE=1 00:08:10.878 23:55:00 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:10.878 23:55:00 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:10.878 23:55:00 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:10.878 23:55:00 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:10.878 23:55:00 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:10.878 23:55:00 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:10.878 23:55:00 -- common/autotest_common.sh@196 -- # cat 00:08:10.878 23:55:00 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:10.878 23:55:00 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:10.878 23:55:00 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:10.878 23:55:00 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:10.878 23:55:00 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:10.878 23:55:00 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:10.878 23:55:00 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:10.878 23:55:00 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:10.878 23:55:00 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:10.878 23:55:00 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:10.878 23:55:00 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:10.878 23:55:00 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:10.878 23:55:00 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:10.878 23:55:00 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:10.878 23:55:00 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:10.878 23:55:00 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:10.878 23:55:00 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:10.878 23:55:00 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:10.878 23:55:00 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:10.878 23:55:00 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:08:10.878 23:55:00 -- common/autotest_common.sh@249 -- # export valgrind= 00:08:10.878 23:55:00 -- common/autotest_common.sh@249 -- # valgrind= 00:08:10.878 23:55:00 -- common/autotest_common.sh@255 -- # uname -s 00:08:10.878 23:55:00 -- 
common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:08:10.878 23:55:00 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:08:10.878 23:55:00 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:08:10.878 23:55:00 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:08:10.878 23:55:00 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:10.878 23:55:00 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:10.878 23:55:00 -- common/autotest_common.sh@265 -- # MAKE=make 00:08:10.878 23:55:00 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:08:10.878 23:55:00 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:08:10.878 23:55:00 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:08:10.878 23:55:00 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:10.878 23:55:00 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:08:10.878 23:55:00 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:08:10.878 23:55:00 -- common/autotest_common.sh@309 -- # [[ -z 482797 ]] 00:08:10.878 23:55:00 -- common/autotest_common.sh@309 -- # kill -0 482797 00:08:10.878 23:55:00 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:08:10.878 23:55:00 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:08:10.878 23:55:00 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:08:10.878 23:55:00 -- common/autotest_common.sh@322 -- # local mount target_dir 00:08:10.878 23:55:00 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:08:10.878 23:55:00 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:08:10.878 23:55:00 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:08:10.878 23:55:00 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:08:10.878 23:55:00 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.05pyAt 00:08:10.878 23:55:00 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:10.878 23:55:00 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:08:10.878 23:55:00 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:08:10.878 23:55:00 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.05pyAt/tests/vfio /tmp/spdk.05pyAt 00:08:10.878 23:55:00 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:08:10.878 23:55:00 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:11.138 23:55:00 -- common/autotest_common.sh@318 -- # df -T 00:08:11.138 23:55:00 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:08:11.138 23:55:00 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:08:11.138 23:55:00 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # 
avails["$mount"]=1052192768 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:08:11.138 23:55:00 -- common/autotest_common.sh@354 -- # uses["$mount"]=4232237056 00:08:11.138 23:55:00 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # avails["$mount"]=52978720768 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742297088 00:08:11.138 23:55:00 -- common/autotest_common.sh@354 -- # uses["$mount"]=8763576320 00:08:11.138 23:55:00 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # avails["$mount"]=30869889024 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871146496 00:08:11.138 23:55:00 -- common/autotest_common.sh@354 -- # uses["$mount"]=1257472 00:08:11.138 23:55:00 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342480896 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348461056 00:08:11.138 23:55:00 -- common/autotest_common.sh@354 -- # uses["$mount"]=5980160 00:08:11.138 23:55:00 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870966272 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871150592 00:08:11.138 23:55:00 -- common/autotest_common.sh@354 -- # uses["$mount"]=184320 00:08:11.138 23:55:00 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:11.138 23:55:00 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:08:11.138 23:55:00 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:08:11.138 23:55:00 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:08:11.138 23:55:00 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:11.138 23:55:00 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:08:11.138 * Looking for test storage... 
00:08:11.138 23:55:00 -- common/autotest_common.sh@359 -- # local target_space new_size 00:08:11.138 23:55:00 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:08:11.138 23:55:00 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:11.138 23:55:00 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:11.138 23:55:00 -- common/autotest_common.sh@363 -- # mount=/ 00:08:11.138 23:55:00 -- common/autotest_common.sh@365 -- # target_space=52978720768 00:08:11.138 23:55:00 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:08:11.138 23:55:00 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:08:11.138 23:55:00 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:08:11.138 23:55:00 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:08:11.138 23:55:00 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:08:11.138 23:55:00 -- common/autotest_common.sh@372 -- # new_size=10978168832 00:08:11.138 23:55:00 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:11.138 23:55:00 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:11.138 23:55:00 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:11.138 23:55:00 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:11.138 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:11.138 23:55:00 -- common/autotest_common.sh@380 -- # return 0 00:08:11.138 23:55:00 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:08:11.138 23:55:00 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:08:11.138 23:55:00 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:11.138 23:55:00 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:11.138 23:55:00 -- common/autotest_common.sh@1672 -- # true 00:08:11.138 23:55:00 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:08:11.138 23:55:00 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:11.138 23:55:00 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:11.138 23:55:00 -- common/autotest_common.sh@27 -- # exec 00:08:11.138 23:55:00 -- common/autotest_common.sh@29 -- # exec 00:08:11.138 23:55:00 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:11.138 23:55:00 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:08:11.138 23:55:00 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:11.138 23:55:00 -- common/autotest_common.sh@18 -- # set -x 00:08:11.138 23:55:00 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:11.138 23:55:00 -- ../common.sh@8 -- # pids=() 00:08:11.138 23:55:00 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:11.138 23:55:00 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:11.138 23:55:00 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:11.138 23:55:00 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:11.138 23:55:00 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:11.138 23:55:00 -- vfio/run.sh@65 -- # mem_size=0 00:08:11.138 23:55:00 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:11.138 23:55:00 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:11.138 23:55:00 -- ../common.sh@69 -- # local fuzz_num=7 00:08:11.138 23:55:00 -- ../common.sh@70 -- # local time=1 00:08:11.139 23:55:00 -- ../common.sh@72 -- # (( i = 0 )) 00:08:11.139 23:55:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:11.139 23:55:00 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:11.139 23:55:00 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:11.139 23:55:00 -- vfio/run.sh@23 -- # local timen=1 00:08:11.139 23:55:00 -- vfio/run.sh@24 -- # local core=0x1 00:08:11.139 23:55:00 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:11.139 23:55:00 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:11.139 23:55:00 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:11.139 23:55:00 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:11.139 23:55:00 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:11.139 23:55:00 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:11.139 23:55:00 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:11.139 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:11.139 23:55:00 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:11.139 [2024-04-25 23:55:00.569328] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
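[editor's note] The llvm_vfio_fuzz command line just above shows how start_llvm_fuzz 0 1 0x1 expands, per the local variables traced in vfio/run.sh: -m is the core mask, -s 0 the memory size, -t the per-target time budget in seconds, -D the corpus directory, -F and -Y the vfio-user socket directories, -r the RPC socket, and -Z the fuzzer index. A condensed restatement of that invocation, with $SPDK_DIR as assumed shorthand for the workspace path:

  fuzzer_type=0; timen=1; core=0x1   # the arguments of: start_llvm_fuzz 0 1 0x1
  "$SPDK_DIR/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" \
    -m "$core" -s 0 \
    -P "$SPDK_DIR/../output/llvm/" \
    -F "/tmp/vfio-user-$fuzzer_type/domain/1" \
    -c "/tmp/vfio-user-$fuzzer_type/fuzz_vfio_json.conf" \
    -t "$timen" \
    -D "$SPDK_DIR/../corpus/llvm_vfio_$fuzzer_type" \
    -Y "/tmp/vfio-user-$fuzzer_type/domain/2" \
    -r "/tmp/vfio-user-$fuzzer_type/spdk$fuzzer_type.sock" \
    -Z "$fuzzer_type"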
00:08:11.139 [2024-04-25 23:55:00.569434] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid482834 ] 00:08:11.139 EAL: No free 2048 kB hugepages reported on node 1 00:08:11.139 [2024-04-25 23:55:00.644822] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.139 [2024-04-25 23:55:00.685434] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:11.139 [2024-04-25 23:55:00.685582] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.398 INFO: Running with entropic power schedule (0xFF, 100). 00:08:11.398 INFO: Seed: 3516919640 00:08:11.398 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd), 00:08:11.398 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0), 00:08:11.398 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:11.398 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.398 #2 INITED exec/s: 0 rss: 60Mb 00:08:11.398 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:11.398 This may also happen if the target rejected all inputs we tried so far 00:08:11.916 NEW_FUNC[1/597]: 0x49cfc0 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:11.916 NEW_FUNC[2/597]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:11.916 #16 NEW cov: 10161 ft: 10675 corp: 2/49b lim: 60 exec/s: 0 rss: 66Mb L: 48/48 MS: 4 CrossOver-CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:08:12.175 NEW_FUNC[1/24]: 0x164a490 in nvme_complete_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1365 00:08:12.175 NEW_FUNC[2/24]: 0x19e8b50 in bdev_writev_blocks_with_md /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/bdev/bdev.c:5547 00:08:12.175 #17 NEW cov: 10718 ft: 14574 corp: 3/104b lim: 60 exec/s: 0 rss: 67Mb L: 55/55 MS: 1 InsertRepeatedBytes- 00:08:12.175 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:12.175 #18 NEW cov: 10738 ft: 14851 corp: 4/161b lim: 60 exec/s: 0 rss: 68Mb L: 57/57 MS: 1 CopyPart- 00:08:12.434 #19 NEW cov: 10738 ft: 15479 corp: 5/216b lim: 60 exec/s: 19 rss: 68Mb L: 55/57 MS: 1 ChangeBit- 00:08:12.692 #20 NEW cov: 10738 ft: 15617 corp: 6/255b lim: 60 exec/s: 20 rss: 68Mb L: 39/57 MS: 1 EraseBytes- 00:08:12.692 #21 NEW cov: 10738 ft: 15989 corp: 7/272b lim: 60 exec/s: 21 rss: 68Mb L: 17/57 MS: 1 CrossOver- 00:08:12.951 #22 NEW cov: 10738 ft: 16142 corp: 8/328b lim: 60 exec/s: 22 rss: 69Mb L: 56/57 MS: 1 CopyPart- 00:08:13.210 #23 NEW cov: 10738 ft: 16152 corp: 9/370b lim: 60 exec/s: 23 rss: 69Mb L: 42/57 MS: 1 CrossOver- 00:08:13.469 #24 NEW cov: 10745 ft: 16564 corp: 10/425b lim: 60 exec/s: 24 rss: 69Mb L: 55/57 MS: 1 ShuffleBytes- 00:08:13.469 #25 NEW cov: 10745 ft: 16607 corp: 11/480b lim: 60 exec/s: 12 rss: 69Mb L: 55/57 MS: 1 CMP- DE: "\001}"- 00:08:13.469 #25 DONE cov: 10745 ft: 16607 corp: 11/480b lim: 60 exec/s: 12 rss: 69Mb 00:08:13.469 ###### Recommended dictionary. ###### 00:08:13.469 "\001}" # Uses: 0 00:08:13.469 ###### End of recommended dictionary. 
###### 00:08:13.469 Done 25 runs in 2 second(s) 00:08:13.729 23:55:03 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:13.729 23:55:03 -- ../common.sh@72 -- # (( i++ )) 00:08:13.729 23:55:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.729 23:55:03 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:13.729 23:55:03 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:13.729 23:55:03 -- vfio/run.sh@23 -- # local timen=1 00:08:13.729 23:55:03 -- vfio/run.sh@24 -- # local core=0x1 00:08:13.729 23:55:03 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:13.729 23:55:03 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:13.729 23:55:03 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:13.729 23:55:03 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:13.729 23:55:03 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:13.729 23:55:03 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:13.729 23:55:03 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:13.729 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:13.729 23:55:03 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:13.729 [2024-04-25 23:55:03.312016] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:13.729 [2024-04-25 23:55:03.312112] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483383 ] 00:08:13.989 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.989 [2024-04-25 23:55:03.384914] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.989 [2024-04-25 23:55:03.420021] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:13.989 [2024-04-25 23:55:03.420167] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.989 INFO: Running with entropic power schedule (0xFF, 100). 00:08:13.989 INFO: Seed: 1954899615 00:08:14.247 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd), 00:08:14.247 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0), 00:08:14.247 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:14.247 INFO: A corpus is not provided, starting from an empty corpus 00:08:14.247 #2 INITED exec/s: 0 rss: 60Mb 00:08:14.247 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
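[editor's note] Before each target runs, run.sh stamps out a private vfio-user tree and rewrites the shared JSON template to point at it (the mkdir -p and sed 's%/tmp/vfio-user/domain/...%' calls traced above, here for target 1). A condensed sketch of that templating step across all seven targets, assuming $SPDK_DIR as before; this is a simplification of the per-iteration calls the trace shows:

  # fuzz_num comes from counting '.fn =' entries in llvm_vfio_fuzz.c (7 in this run).
  fuzz_num=$(grep -c '\.fn =' "$SPDK_DIR/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c")
  for ((i = 0; i < fuzz_num; i++)); do
    mkdir -p "/tmp/vfio-user-$i/domain/1" "/tmp/vfio-user-$i/domain/2"
    # Rewrite the template's socket paths for this target's private directories.
    sed -e "s%/tmp/vfio-user/domain/1%/tmp/vfio-user-$i/domain/1%;
            s%/tmp/vfio-user/domain/2%/tmp/vfio-user-$i/domain/2%" \
        "$SPDK_DIR/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" \
        > "/tmp/vfio-user-$i/fuzz_vfio_json.conf"
  done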
00:08:14.247 This may also happen if the target rejected all inputs we tried so far 00:08:14.247 [2024-04-25 23:55:03.696444] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:14.248 [2024-04-25 23:55:03.696478] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:14.248 [2024-04-25 23:55:03.696497] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:14.506 NEW_FUNC[1/628]: 0x49d560 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:14.506 NEW_FUNC[2/628]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:14.506 #4 NEW cov: 10718 ft: 10609 corp: 2/34b lim: 40 exec/s: 0 rss: 65Mb L: 33/33 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:14.765 [2024-04-25 23:55:04.174389] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:14.765 [2024-04-25 23:55:04.174433] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:14.765 [2024-04-25 23:55:04.174459] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:14.765 #14 NEW cov: 10732 ft: 13261 corp: 3/59b lim: 40 exec/s: 0 rss: 67Mb L: 25/33 MS: 5 ChangeByte-ShuffleBytes-CrossOver-CopyPart-InsertRepeatedBytes- 00:08:14.765 [2024-04-25 23:55:04.352543] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:14.765 [2024-04-25 23:55:04.352569] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:14.765 [2024-04-25 23:55:04.352588] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:15.041 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:15.041 #25 NEW cov: 10749 ft: 14820 corp: 4/88b lim: 40 exec/s: 0 rss: 68Mb L: 29/33 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:15.041 [2024-04-25 23:55:04.519219] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:15.041 [2024-04-25 23:55:04.519240] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:15.041 [2024-04-25 23:55:04.519259] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:15.041 #26 NEW cov: 10749 ft: 16243 corp: 5/125b lim: 40 exec/s: 26 rss: 68Mb L: 37/37 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:15.300 [2024-04-25 23:55:04.685527] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:15.300 [2024-04-25 23:55:04.685548] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:15.300 [2024-04-25 23:55:04.685567] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:15.300 #27 NEW cov: 10749 ft: 16342 corp: 6/159b lim: 40 exec/s: 27 rss: 68Mb L: 34/37 MS: 1 InsertByte- 00:08:15.300 [2024-04-25 23:55:04.851276] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:15.300 [2024-04-25 23:55:04.851297] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:15.300 [2024-04-25 23:55:04.851316] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:15.560 #28 NEW cov: 10749 ft: 16415 corp: 7/196b lim: 40 
exec/s: 28 rss: 68Mb L: 37/37 MS: 1 ChangeBinInt- 00:08:15.560 [2024-04-25 23:55:05.016657] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:15.560 [2024-04-25 23:55:05.016678] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:15.560 [2024-04-25 23:55:05.016696] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:15.560 #29 NEW cov: 10749 ft: 16509 corp: 8/229b lim: 40 exec/s: 29 rss: 68Mb L: 33/37 MS: 1 ChangeBit- 00:08:15.818 [2024-04-25 23:55:05.183529] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:15.819 [2024-04-25 23:55:05.183550] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:15.819 [2024-04-25 23:55:05.183569] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:15.819 #30 NEW cov: 10749 ft: 16727 corp: 9/255b lim: 40 exec/s: 30 rss: 68Mb L: 26/37 MS: 1 InsertByte- 00:08:15.819 [2024-04-25 23:55:05.348215] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:15.819 [2024-04-25 23:55:05.348237] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:15.819 [2024-04-25 23:55:05.348256] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:16.077 #31 NEW cov: 10756 ft: 16909 corp: 10/289b lim: 40 exec/s: 31 rss: 68Mb L: 34/37 MS: 1 ChangeByte- 00:08:16.077 [2024-04-25 23:55:05.513136] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:16.077 [2024-04-25 23:55:05.513157] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:16.077 [2024-04-25 23:55:05.513176] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:16.077 #33 NEW cov: 10756 ft: 17267 corp: 11/314b lim: 40 exec/s: 16 rss: 68Mb L: 25/37 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:16.077 #33 DONE cov: 10756 ft: 17267 corp: 11/314b lim: 40 exec/s: 16 rss: 68Mb 00:08:16.077 ###### Recommended dictionary. ###### 00:08:16.077 "\002\000\000\000" # Uses: 0 00:08:16.077 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:16.077 ###### End of recommended dictionary. 
###### 00:08:16.077 Done 33 runs in 2 second(s) 00:08:16.336 23:55:05 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:16.336 23:55:05 -- ../common.sh@72 -- # (( i++ )) 00:08:16.336 23:55:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:16.336 23:55:05 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:16.336 23:55:05 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:16.336 23:55:05 -- vfio/run.sh@23 -- # local timen=1 00:08:16.336 23:55:05 -- vfio/run.sh@24 -- # local core=0x1 00:08:16.336 23:55:05 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:16.336 23:55:05 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:16.336 23:55:05 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:16.336 23:55:05 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:16.336 23:55:05 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:16.336 23:55:05 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:16.336 23:55:05 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:16.336 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:16.336 23:55:05 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:16.336 [2024-04-25 23:55:05.913122] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:16.336 [2024-04-25 23:55:05.913192] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483863 ] 00:08:16.336 EAL: No free 2048 kB hugepages reported on node 1 00:08:16.596 [2024-04-25 23:55:05.984004] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.596 [2024-04-25 23:55:06.019246] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:16.596 [2024-04-25 23:55:06.019415] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.596 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.596 INFO: Seed: 263931056 00:08:16.855 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd), 00:08:16.855 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0), 00:08:16.855 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:16.855 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.855 #2 INITED exec/s: 0 rss: 60Mb 00:08:16.855 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:16.855 This may also happen if the target rejected all inputs we tried so far 00:08:16.855 [2024-04-25 23:55:06.330410] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:17.114 NEW_FUNC[1/626]: 0x49df40 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:17.114 NEW_FUNC[2/626]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:17.114 #5 NEW cov: 10699 ft: 10362 corp: 2/76b lim: 80 exec/s: 0 rss: 66Mb L: 75/75 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:17.373 [2024-04-25 23:55:06.810605] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:17.373 #6 NEW cov: 10713 ft: 13134 corp: 3/155b lim: 80 exec/s: 0 rss: 67Mb L: 79/79 MS: 1 InsertRepeatedBytes- 00:08:17.631 [2024-04-25 23:55:07.008495] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:17.631 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:17.631 #7 NEW cov: 10733 ft: 14402 corp: 4/235b lim: 80 exec/s: 0 rss: 68Mb L: 80/80 MS: 1 InsertRepeatedBytes- 00:08:17.631 [2024-04-25 23:55:07.211295] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:17.890 #8 NEW cov: 10733 ft: 14703 corp: 5/277b lim: 80 exec/s: 8 rss: 68Mb L: 42/80 MS: 1 EraseBytes- 00:08:17.890 [2024-04-25 23:55:07.412610] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:18.149 #9 NEW cov: 10733 ft: 15553 corp: 6/354b lim: 80 exec/s: 9 rss: 68Mb L: 77/80 MS: 1 CMP- DE: "\020\000"- 00:08:18.149 [2024-04-25 23:55:07.610595] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:18.149 #10 NEW cov: 10733 ft: 15830 corp: 7/434b lim: 80 exec/s: 10 rss: 68Mb L: 80/80 MS: 1 CopyPart- 00:08:18.407 [2024-04-25 23:55:07.801374] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:18.408 #11 NEW cov: 10733 ft: 16658 corp: 8/514b lim: 80 exec/s: 11 rss: 69Mb L: 80/80 MS: 1 PersAutoDict- DE: "\020\000"- 00:08:18.408 [2024-04-25 23:55:07.999808] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:18.666 #12 NEW cov: 10740 ft: 16749 corp: 9/579b lim: 80 exec/s: 12 rss: 69Mb L: 65/80 MS: 1 InsertRepeatedBytes- 00:08:18.666 [2024-04-25 23:55:08.198202] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:18.925 #13 NEW cov: 10740 ft: 16764 corp: 10/622b lim: 80 exec/s: 6 rss: 69Mb L: 43/80 MS: 1 EraseBytes- 00:08:18.925 #13 DONE cov: 10740 ft: 16764 corp: 10/622b lim: 80 exec/s: 6 rss: 69Mb 00:08:18.925 ###### Recommended dictionary. ###### 00:08:18.925 "\020\000" # Uses: 1 00:08:18.925 ###### End of recommended dictionary. 
###### 00:08:18.925 Done 13 runs in 2 second(s) 00:08:19.183 23:55:08 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:19.183 23:55:08 -- ../common.sh@72 -- # (( i++ )) 00:08:19.183 23:55:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:19.183 23:55:08 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:19.183 23:55:08 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:19.183 23:55:08 -- vfio/run.sh@23 -- # local timen=1 00:08:19.183 23:55:08 -- vfio/run.sh@24 -- # local core=0x1 00:08:19.183 23:55:08 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:19.183 23:55:08 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:19.183 23:55:08 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:19.183 23:55:08 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:19.183 23:55:08 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:19.183 23:55:08 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:19.183 23:55:08 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:19.183 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:19.183 23:55:08 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:19.183 [2024-04-25 23:55:08.596054] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:19.183 [2024-04-25 23:55:08.596113] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid484226 ] 00:08:19.183 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.183 [2024-04-25 23:55:08.665073] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.183 [2024-04-25 23:55:08.700352] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:19.183 [2024-04-25 23:55:08.700520] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.441 INFO: Running with entropic power schedule (0xFF, 100). 00:08:19.441 INFO: Seed: 2949941937 00:08:19.441 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd), 00:08:19.441 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0), 00:08:19.441 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:19.441 INFO: A corpus is not provided, starting from an empty corpus 00:08:19.441 #2 INITED exec/s: 0 rss: 60Mb 00:08:19.441 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:19.441 This may also happen if the target rejected all inputs we tried so far 00:08:19.958 NEW_FUNC[1/622]: 0x49e620 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:19.958 NEW_FUNC[2/622]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:19.958 #6 NEW cov: 10686 ft: 10510 corp: 2/39b lim: 320 exec/s: 0 rss: 65Mb L: 38/38 MS: 4 ChangeBinInt-InsertRepeatedBytes-CrossOver-CopyPart- 00:08:19.958 #12 NEW cov: 10702 ft: 14052 corp: 3/77b lim: 320 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 CrossOver- 00:08:20.216 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:20.216 #18 NEW cov: 10721 ft: 14834 corp: 4/115b lim: 320 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 ShuffleBytes- 00:08:20.474 #24 NEW cov: 10721 ft: 15193 corp: 5/153b lim: 320 exec/s: 24 rss: 68Mb L: 38/38 MS: 1 ShuffleBytes- 00:08:20.732 #27 NEW cov: 10721 ft: 15293 corp: 6/208b lim: 320 exec/s: 27 rss: 68Mb L: 55/55 MS: 3 EraseBytes-ChangeByte-CrossOver- 00:08:20.732 #33 NEW cov: 10721 ft: 15379 corp: 7/242b lim: 320 exec/s: 33 rss: 68Mb L: 34/55 MS: 1 EraseBytes- 00:08:20.990 #34 NEW cov: 10721 ft: 15417 corp: 8/314b lim: 320 exec/s: 34 rss: 68Mb L: 72/72 MS: 1 CopyPart- 00:08:21.248 #35 NEW cov: 10721 ft: 15457 corp: 9/352b lim: 320 exec/s: 35 rss: 68Mb L: 38/72 MS: 1 ShuffleBytes- 00:08:21.248 #36 NEW cov: 10728 ft: 15687 corp: 10/424b lim: 320 exec/s: 36 rss: 68Mb L: 72/72 MS: 1 ShuffleBytes- 00:08:21.506 #37 NEW cov: 10728 ft: 16377 corp: 11/520b lim: 320 exec/s: 18 rss: 68Mb L: 96/96 MS: 1 CrossOver- 00:08:21.506 #37 DONE cov: 10728 ft: 16377 corp: 11/520b lim: 320 exec/s: 18 rss: 68Mb 00:08:21.506 Done 37 runs in 2 second(s) 00:08:21.764 23:55:11 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:21.764 23:55:11 -- ../common.sh@72 -- # (( i++ )) 00:08:21.764 23:55:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:21.764 23:55:11 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:21.764 23:55:11 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:21.764 23:55:11 -- vfio/run.sh@23 -- # local timen=1 00:08:21.764 23:55:11 -- vfio/run.sh@24 -- # local core=0x1 00:08:21.764 23:55:11 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:21.764 23:55:11 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:21.764 23:55:11 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:21.764 23:55:11 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:21.764 23:55:11 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:21.764 23:55:11 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:21.764 23:55:11 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:21.764 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:21.764 23:55:11 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:21.764 [2024-04-25 23:55:11.305487] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:21.764 [2024-04-25 23:55:11.305560] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid484759 ] 00:08:21.764 EAL: No free 2048 kB hugepages reported on node 1 00:08:22.023 [2024-04-25 23:55:11.377322] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.023 [2024-04-25 23:55:11.412289] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:22.023 [2024-04-25 23:55:11.412471] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.023 INFO: Running with entropic power schedule (0xFF, 100). 00:08:22.023 INFO: Seed: 1360967027 00:08:22.023 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd), 00:08:22.023 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0), 00:08:22.023 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:22.023 INFO: A corpus is not provided, starting from an empty corpus 00:08:22.023 #2 INITED exec/s: 0 rss: 60Mb 00:08:22.023 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:22.023 This may also happen if the target rejected all inputs we tried so far 00:08:22.540 NEW_FUNC[1/622]: 0x49eea0 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:22.540 NEW_FUNC[2/622]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:22.540 #7 NEW cov: 10693 ft: 10592 corp: 2/114b lim: 320 exec/s: 0 rss: 65Mb L: 113/113 MS: 5 InsertByte-ChangeBit-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:22.797 #30 NEW cov: 10710 ft: 12825 corp: 3/202b lim: 320 exec/s: 0 rss: 67Mb L: 88/113 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:22.797 #31 NEW cov: 10710 ft: 13741 corp: 4/290b lim: 320 exec/s: 0 rss: 67Mb L: 88/113 MS: 1 ChangeBit- 00:08:22.797 #32 NEW cov: 10710 ft: 14423 corp: 5/403b lim: 320 exec/s: 0 rss: 67Mb L: 113/113 MS: 1 CopyPart- 00:08:23.055 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:23.055 #33 NEW cov: 10727 ft: 14853 corp: 6/520b lim: 320 exec/s: 0 rss: 68Mb L: 117/117 MS: 1 CMP- DE: "\377\377\377a"- 00:08:23.055 #34 NEW cov: 10727 ft: 15225 corp: 7/686b lim: 320 exec/s: 0 rss: 68Mb L: 166/166 MS: 1 CopyPart- 00:08:23.312 #35 NEW cov: 10727 ft: 15930 corp: 8/799b lim: 320 exec/s: 35 rss: 68Mb L: 113/166 MS: 1 ChangeBinInt- 00:08:23.570 #36 NEW cov: 10727 ft: 16249 corp: 9/904b lim: 320 exec/s: 36 rss: 68Mb L: 105/166 MS: 1 EraseBytes- 00:08:23.570 #37 NEW cov: 10727 ft: 16544 corp: 10/974b lim: 320 exec/s: 37 rss: 68Mb L: 70/166 MS: 1 EraseBytes- 00:08:23.828 #38 NEW cov: 10727 ft: 16676 corp: 11/1033b lim: 320 exec/s: 38 rss: 68Mb L: 59/166 MS: 1 EraseBytes- 00:08:24.086 #39 NEW cov: 10734 ft: 16810 corp: 12/1144b lim: 320 exec/s: 39 rss: 68Mb L: 111/166 MS: 1 CrossOver- 00:08:24.344 #40 NEW cov: 10734 ft: 16915 corp: 13/1261b lim: 320 exec/s: 20 rss: 68Mb L: 117/166 MS: 
1 CopyPart- 00:08:24.344 #40 DONE cov: 10734 ft: 16915 corp: 13/1261b lim: 320 exec/s: 20 rss: 68Mb 00:08:24.344 ###### Recommended dictionary. ###### 00:08:24.344 "\377\377\377a" # Uses: 0 00:08:24.344 ###### End of recommended dictionary. ###### 00:08:24.344 Done 40 runs in 2 second(s) 00:08:24.603 23:55:13 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:24.603 23:55:13 -- ../common.sh@72 -- # (( i++ )) 00:08:24.603 23:55:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:24.603 23:55:13 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:24.603 23:55:13 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:24.603 23:55:13 -- vfio/run.sh@23 -- # local timen=1 00:08:24.603 23:55:13 -- vfio/run.sh@24 -- # local core=0x1 00:08:24.603 23:55:13 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:24.603 23:55:13 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:24.603 23:55:13 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:24.603 23:55:13 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:24.603 23:55:13 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:24.603 23:55:13 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:24.603 23:55:13 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:24.603 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:24.603 23:55:13 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:24.603 [2024-04-25 23:55:14.017224] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 00:08:24.603 [2024-04-25 23:55:14.017319] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid485303 ] 00:08:24.603 EAL: No free 2048 kB hugepages reported on node 1 00:08:24.603 [2024-04-25 23:55:14.089832] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.603 [2024-04-25 23:55:14.124852] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:24.603 [2024-04-25 23:55:14.124996] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.862 INFO: Running with entropic power schedule (0xFF, 100). 00:08:24.862 INFO: Seed: 4076963591 00:08:24.862 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd), 00:08:24.862 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0), 00:08:24.862 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:24.862 INFO: A corpus is not provided, starting from an empty corpus 00:08:24.862 #2 INITED exec/s: 0 rss: 60Mb 00:08:24.862 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:08:24.862 This may also happen if the target rejected all inputs we tried so far 00:08:24.862 [2024-04-25 23:55:14.418462] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:24.862 [2024-04-25 23:55:14.418506] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:25.380 NEW_FUNC[1/617]: 0x49f8a0 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:25.380 NEW_FUNC[2/617]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:25.380 #14 NEW cov: 10531 ft: 10648 corp: 2/85b lim: 120 exec/s: 0 rss: 66Mb L: 84/84 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:25.380 [2024-04-25 23:55:14.884443] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:25.380 [2024-04-25 23:55:14.884498] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:25.638 NEW_FUNC[1/11]: 0x12762c0 in nvmf_transport_req_complete /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:735 00:08:25.638 NEW_FUNC[2/11]: 0x1340760 in nvmf_vfio_user_req_complete /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:5306 00:08:25.638 #15 NEW cov: 10734 ft: 14414 corp: 3/150b lim: 120 exec/s: 0 rss: 67Mb L: 65/84 MS: 1 EraseBytes- 00:08:25.638 [2024-04-25 23:55:15.079293] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:25.639 [2024-04-25 23:55:15.079324] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:25.639 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:25.639 #16 NEW cov: 10751 ft: 15225 corp: 4/214b lim: 120 exec/s: 0 rss: 68Mb L: 64/84 MS: 1 EraseBytes- 00:08:25.897 [2024-04-25 23:55:15.265995] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:25.897 [2024-04-25 23:55:15.266024] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:25.897 #17 NEW cov: 10751 ft: 15819 corp: 5/245b lim: 120 exec/s: 17 rss: 68Mb L: 31/84 MS: 1 CrossOver- 00:08:25.897 [2024-04-25 23:55:15.462823] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:25.897 [2024-04-25 23:55:15.462854] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.156 #23 NEW cov: 10751 ft: 16131 corp: 6/277b lim: 120 exec/s: 23 rss: 68Mb L: 32/84 MS: 1 CrossOver- 00:08:26.156 [2024-04-25 23:55:15.647651] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.156 [2024-04-25 23:55:15.647682] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.156 #24 NEW cov: 10751 ft: 16495 corp: 7/342b lim: 120 exec/s: 24 rss: 68Mb L: 65/84 MS: 1 ChangeByte- 00:08:26.414 [2024-04-25 23:55:15.833613] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.414 [2024-04-25 23:55:15.833641] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.414 #25 NEW cov: 10751 ft: 16848 corp: 8/374b lim: 120 exec/s: 25 rss: 68Mb L: 32/84 MS: 1 ChangeBit- 00:08:26.414 [2024-04-25 23:55:16.019913] vfio_user.c:3096:vfio_user_log: *ERROR*: 
/tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.414 [2024-04-25 23:55:16.019942] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.672 #26 NEW cov: 10758 ft: 17011 corp: 9/405b lim: 120 exec/s: 26 rss: 68Mb L: 31/84 MS: 1 CopyPart- 00:08:26.672 [2024-04-25 23:55:16.203734] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.672 [2024-04-25 23:55:16.203763] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.930 #27 NEW cov: 10758 ft: 17094 corp: 10/436b lim: 120 exec/s: 27 rss: 68Mb L: 31/84 MS: 1 CrossOver- 00:08:26.930 [2024-04-25 23:55:16.389572] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.930 [2024-04-25 23:55:16.389602] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.930 #28 NEW cov: 10758 ft: 17240 corp: 11/530b lim: 120 exec/s: 14 rss: 68Mb L: 94/94 MS: 1 CrossOver- 00:08:26.930 #28 DONE cov: 10758 ft: 17240 corp: 11/530b lim: 120 exec/s: 14 rss: 68Mb 00:08:26.931 Done 28 runs in 2 second(s) 00:08:27.189 23:55:16 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:27.189 23:55:16 -- ../common.sh@72 -- # (( i++ )) 00:08:27.189 23:55:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:27.189 23:55:16 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:27.189 23:55:16 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:27.189 23:55:16 -- vfio/run.sh@23 -- # local timen=1 00:08:27.189 23:55:16 -- vfio/run.sh@24 -- # local core=0x1 00:08:27.189 23:55:16 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:27.189 23:55:16 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:27.189 23:55:16 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:27.189 23:55:16 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:27.189 23:55:16 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:27.189 23:55:16 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:27.189 23:55:16 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:27.189 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:27.189 23:55:16 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:27.189 [2024-04-25 23:55:16.794196] Starting SPDK v24.01.1-pre git sha1 36faa8c31 / DPDK 23.11.0 initialization... 
00:08:27.189 [2024-04-25 23:55:16.794266] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid485849 ] 00:08:27.448 EAL: No free 2048 kB hugepages reported on node 1 00:08:27.448 [2024-04-25 23:55:16.866025] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.448 [2024-04-25 23:55:16.900850] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:27.448 [2024-04-25 23:55:16.900995] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.707 INFO: Running with entropic power schedule (0xFF, 100). 00:08:27.707 INFO: Seed: 2552995197 00:08:27.707 INFO: Loaded 1 modules (338449 inline 8-bit counters): 338449 [0x2669dcc, 0x26bc7dd), 00:08:27.707 INFO: Loaded 1 PC tables (338449 PCs): 338449 [0x26bc7e0,0x2be68f0), 00:08:27.707 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:27.707 INFO: A corpus is not provided, starting from an empty corpus 00:08:27.707 #2 INITED exec/s: 0 rss: 60Mb 00:08:27.707 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:27.707 This may also happen if the target rejected all inputs we tried so far 00:08:27.707 [2024-04-25 23:55:17.151451] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:27.707 [2024-04-25 23:55:17.151492] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:27.965 NEW_FUNC[1/628]: 0x4a0590 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:27.966 NEW_FUNC[2/628]: 0x4a2b60 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:27.966 #7 NEW cov: 10716 ft: 10670 corp: 2/84b lim: 90 exec/s: 0 rss: 65Mb L: 83/83 MS: 5 ChangeBinInt-ChangeBinInt-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:27.966 [2024-04-25 23:55:17.562700] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:27.966 [2024-04-25 23:55:17.562745] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:28.224 #12 NEW cov: 10730 ft: 13575 corp: 3/140b lim: 90 exec/s: 0 rss: 67Mb L: 56/83 MS: 5 ChangeBit-CopyPart-CopyPart-InsertByte-InsertRepeatedBytes- 00:08:28.224 [2024-04-25 23:55:17.687560] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:28.224 [2024-04-25 23:55:17.687595] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:28.224 #13 NEW cov: 10730 ft: 14694 corp: 4/196b lim: 90 exec/s: 0 rss: 67Mb L: 56/83 MS: 1 ShuffleBytes- 00:08:28.224 [2024-04-25 23:55:17.810381] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:28.224 [2024-04-25 23:55:17.810422] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:28.483 #14 NEW cov: 10730 ft: 15032 corp: 5/253b lim: 90 exec/s: 0 rss: 67Mb L: 57/83 MS: 1 InsertByte- 00:08:28.483 [2024-04-25 23:55:17.926199] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:28.483 [2024-04-25 23:55:17.926235] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 
00:08:28.483 NEW_FUNC[1/1]: 0x1945090 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:28.483 #15 NEW cov: 10747 ft: 15522 corp: 6/310b lim: 90 exec/s: 0 rss: 68Mb L: 57/83 MS: 1 ChangeBinInt- 00:08:28.483 [2024-04-25 23:55:18.050109] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:28.483 [2024-04-25 23:55:18.050143] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:28.742 #16 NEW cov: 10747 ft: 15639 corp: 7/367b lim: 90 exec/s: 16 rss: 68Mb L: 57/83 MS: 1 ChangeByte- 00:08:28.742 [2024-04-25 23:55:18.163779] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:28.742 [2024-04-25 23:55:18.163814] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:28.742 #17 NEW cov: 10747 ft: 15832 corp: 8/424b lim: 90 exec/s: 17 rss: 68Mb L: 57/83 MS: 1 ChangeByte- 00:08:28.742 [2024-04-25 23:55:18.278676] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:28.742 [2024-04-25 23:55:18.278713] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:28.742 #18 NEW cov: 10747 ft: 15888 corp: 9/487b lim: 90 exec/s: 18 rss: 68Mb L: 63/83 MS: 1 CopyPart- 00:08:29.000 [2024-04-25 23:55:18.402281] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.000 [2024-04-25 23:55:18.402314] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.000 #19 NEW cov: 10747 ft: 15972 corp: 10/544b lim: 90 exec/s: 19 rss: 68Mb L: 57/83 MS: 1 CopyPart- 00:08:29.000 [2024-04-25 23:55:18.505965] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.000 [2024-04-25 23:55:18.505998] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.000 #20 NEW cov: 10747 ft: 16245 corp: 11/616b lim: 90 exec/s: 20 rss: 68Mb L: 72/83 MS: 1 EraseBytes- 00:08:29.259 [2024-04-25 23:55:18.629803] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.259 [2024-04-25 23:55:18.629837] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.260 #21 NEW cov: 10747 ft: 16377 corp: 12/679b lim: 90 exec/s: 21 rss: 68Mb L: 63/83 MS: 1 ChangeBinInt- 00:08:29.260 [2024-04-25 23:55:18.743648] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.260 [2024-04-25 23:55:18.743681] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.260 #27 NEW cov: 10747 ft: 16671 corp: 13/714b lim: 90 exec/s: 27 rss: 68Mb L: 35/83 MS: 1 CrossOver- 00:08:29.260 [2024-04-25 23:55:18.857424] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.260 [2024-04-25 23:55:18.857455] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.518 #33 NEW cov: 10754 ft: 16712 corp: 14/804b lim: 90 exec/s: 33 rss: 68Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:29.518 [2024-04-25 23:55:18.971172] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.518 [2024-04-25 23:55:18.971204] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.518 #34 NEW cov: 10754 ft: 16837 corp: 15/859b lim: 90 exec/s: 34 rss: 68Mb L: 55/90 MS: 1 EraseBytes- 
00:08:29.518 [2024-04-25 23:55:19.084947] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:29.518 [2024-04-25 23:55:19.084980] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:29.777 #35 NEW cov: 10754 ft: 17253 corp: 16/922b lim: 90 exec/s: 17 rss: 68Mb L: 63/90 MS: 1 CopyPart- 00:08:29.777 #35 DONE cov: 10754 ft: 17253 corp: 16/922b lim: 90 exec/s: 17 rss: 68Mb 00:08:29.777 Done 35 runs in 2 second(s) 00:08:30.036 23:55:19 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:30.036 23:55:19 -- ../common.sh@72 -- # (( i++ )) 00:08:30.036 23:55:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.036 23:55:19 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:30.036 00:08:30.036 real 0m19.157s 00:08:30.036 user 0m26.835s 00:08:30.036 sys 0m1.800s 00:08:30.036 23:55:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.036 23:55:19 -- common/autotest_common.sh@10 -- # set +x 00:08:30.036 ************************************ 00:08:30.036 END TEST vfio_fuzz 00:08:30.036 ************************************ 00:08:30.036 23:55:19 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:08:30.036 00:08:30.036 real 1m22.280s 00:08:30.036 user 2m6.109s 00:08:30.036 sys 0m9.328s 00:08:30.036 23:55:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.036 23:55:19 -- common/autotest_common.sh@10 -- # set +x 00:08:30.036 ************************************ 00:08:30.036 END TEST llvm_fuzz 00:08:30.036 ************************************ 00:08:30.036 23:55:19 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]] 00:08:30.036 23:55:19 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT 00:08:30.036 23:55:19 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup 00:08:30.036 23:55:19 -- common/autotest_common.sh@712 -- # xtrace_disable 00:08:30.036 23:55:19 -- common/autotest_common.sh@10 -- # set +x 00:08:30.036 23:55:19 -- spdk/autotest.sh@386 -- # autotest_cleanup 00:08:30.036 23:55:19 -- common/autotest_common.sh@1371 -- # local autotest_es=0 00:08:30.036 23:55:19 -- common/autotest_common.sh@1372 -- # xtrace_disable 00:08:30.036 23:55:19 -- common/autotest_common.sh@10 -- # set +x 00:08:36.623 INFO: APP EXITING 00:08:36.623 INFO: killing all VMs 00:08:36.623 INFO: killing vhost app 00:08:36.623 INFO: EXIT DONE 00:08:39.159 Waiting for block devices as requested 00:08:39.159 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:39.419 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:39.419 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:39.419 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:39.679 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:39.679 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:39.679 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:39.679 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:39.938 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:39.938 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:39.938 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:40.198 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:40.198 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:40.198 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:40.457 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:40.457 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:40.458 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:08:44.651 Cleaning 00:08:44.651 Removing: /dev/shm/spdk_tgt_trace.pid449115 00:08:44.651 Removing: /var/run/dpdk/spdk_pid446654 00:08:44.651 Removing: 
/var/run/dpdk/spdk_pid447910 00:08:44.651 Removing: /var/run/dpdk/spdk_pid449115 00:08:44.651 Removing: /var/run/dpdk/spdk_pid449814 00:08:44.651 Removing: /var/run/dpdk/spdk_pid450093 00:08:44.651 Removing: /var/run/dpdk/spdk_pid450397 00:08:44.651 Removing: /var/run/dpdk/spdk_pid450730 00:08:44.651 Removing: /var/run/dpdk/spdk_pid451013 00:08:44.651 Removing: /var/run/dpdk/spdk_pid451229 00:08:44.651 Removing: /var/run/dpdk/spdk_pid451512 00:08:44.651 Removing: /var/run/dpdk/spdk_pid451824 00:08:44.651 Removing: /var/run/dpdk/spdk_pid452559 00:08:44.651 Removing: /var/run/dpdk/spdk_pid455611 00:08:44.651 Removing: /var/run/dpdk/spdk_pid455927 00:08:44.651 Removing: /var/run/dpdk/spdk_pid456245 00:08:44.651 Removing: /var/run/dpdk/spdk_pid456493 00:08:44.651 Removing: /var/run/dpdk/spdk_pid457067 00:08:44.651 Removing: /var/run/dpdk/spdk_pid457131 00:08:44.651 Removing: /var/run/dpdk/spdk_pid457656 00:08:44.651 Removing: /var/run/dpdk/spdk_pid457924 00:08:44.651 Removing: /var/run/dpdk/spdk_pid458218 00:08:44.651 Removing: /var/run/dpdk/spdk_pid458241 00:08:44.651 Removing: /var/run/dpdk/spdk_pid458533 00:08:44.651 Removing: /var/run/dpdk/spdk_pid458657 00:08:44.651 Removing: /var/run/dpdk/spdk_pid459168 00:08:44.651 Removing: /var/run/dpdk/spdk_pid459452 00:08:44.651 Removing: /var/run/dpdk/spdk_pid459598 00:08:44.651 Removing: /var/run/dpdk/spdk_pid459808 00:08:44.651 Removing: /var/run/dpdk/spdk_pid460108 00:08:44.651 Removing: /var/run/dpdk/spdk_pid460138 00:08:44.651 Removing: /var/run/dpdk/spdk_pid460270 00:08:44.651 Removing: /var/run/dpdk/spdk_pid460461 00:08:44.651 Removing: /var/run/dpdk/spdk_pid460750 00:08:44.651 Removing: /var/run/dpdk/spdk_pid461017 00:08:44.651 Removing: /var/run/dpdk/spdk_pid461305 00:08:44.651 Removing: /var/run/dpdk/spdk_pid461553 00:08:44.651 Removing: /var/run/dpdk/spdk_pid461744 00:08:44.651 Removing: /var/run/dpdk/spdk_pid461889 00:08:44.651 Removing: /var/run/dpdk/spdk_pid462160 00:08:44.651 Removing: /var/run/dpdk/spdk_pid462434 00:08:44.651 Removing: /var/run/dpdk/spdk_pid462717 00:08:44.651 Removing: /var/run/dpdk/spdk_pid462988 00:08:44.651 Removing: /var/run/dpdk/spdk_pid463269 00:08:44.651 Removing: /var/run/dpdk/spdk_pid463427 00:08:44.651 Removing: /var/run/dpdk/spdk_pid463600 00:08:44.651 Removing: /var/run/dpdk/spdk_pid463845 00:08:44.651 Removing: /var/run/dpdk/spdk_pid464132 00:08:44.651 Removing: /var/run/dpdk/spdk_pid464400 00:08:44.651 Removing: /var/run/dpdk/spdk_pid464686 00:08:44.651 Removing: /var/run/dpdk/spdk_pid464905 00:08:44.651 Removing: /var/run/dpdk/spdk_pid465101 00:08:44.651 Removing: /var/run/dpdk/spdk_pid465264 00:08:44.651 Removing: /var/run/dpdk/spdk_pid465547 00:08:44.651 Removing: /var/run/dpdk/spdk_pid465819 00:08:44.651 Removing: /var/run/dpdk/spdk_pid466102 00:08:44.651 Removing: /var/run/dpdk/spdk_pid466373 00:08:44.651 Removing: /var/run/dpdk/spdk_pid466605 00:08:44.651 Removing: /var/run/dpdk/spdk_pid466754 00:08:44.651 Removing: /var/run/dpdk/spdk_pid466963 00:08:44.651 Removing: /var/run/dpdk/spdk_pid467231 00:08:44.651 Removing: /var/run/dpdk/spdk_pid467518 00:08:44.651 Removing: /var/run/dpdk/spdk_pid467790 00:08:44.651 Removing: /var/run/dpdk/spdk_pid468072 00:08:44.651 Removing: /var/run/dpdk/spdk_pid468261 00:08:44.651 Removing: /var/run/dpdk/spdk_pid468467 00:08:44.651 Removing: /var/run/dpdk/spdk_pid468663 00:08:44.651 Removing: /var/run/dpdk/spdk_pid468950 00:08:44.651 Removing: /var/run/dpdk/spdk_pid469224 00:08:44.651 Removing: /var/run/dpdk/spdk_pid469510 00:08:44.651 Removing: 
/var/run/dpdk/spdk_pid469776 00:08:44.651 Removing: /var/run/dpdk/spdk_pid470047 00:08:44.651 Removing: /var/run/dpdk/spdk_pid470130 00:08:44.651 Removing: /var/run/dpdk/spdk_pid470387 00:08:44.651 Removing: /var/run/dpdk/spdk_pid470924 00:08:44.651 Removing: /var/run/dpdk/spdk_pid471486 00:08:44.651 Removing: /var/run/dpdk/spdk_pid471905 00:08:44.651 Removing: /var/run/dpdk/spdk_pid472777 00:08:44.651 Removing: /var/run/dpdk/spdk_pid473397 00:08:44.651 Removing: /var/run/dpdk/spdk_pid473739 00:08:44.651 Removing: /var/run/dpdk/spdk_pid474235 00:08:44.651 Removing: /var/run/dpdk/spdk_pid474778 00:08:44.651 Removing: /var/run/dpdk/spdk_pid475110 00:08:44.651 Removing: /var/run/dpdk/spdk_pid475608 00:08:44.651 Removing: /var/run/dpdk/spdk_pid476092 00:08:44.651 Removing: /var/run/dpdk/spdk_pid476437 00:08:44.651 Removing: /var/run/dpdk/spdk_pid476979 00:08:44.651 Removing: /var/run/dpdk/spdk_pid477351 00:08:44.651 Removing: /var/run/dpdk/spdk_pid477812 00:08:44.651 Removing: /var/run/dpdk/spdk_pid478347 00:08:44.651 Removing: /var/run/dpdk/spdk_pid478649 00:08:44.651 Removing: /var/run/dpdk/spdk_pid479185 00:08:44.651 Removing: /var/run/dpdk/spdk_pid479657 00:08:44.651 Removing: /var/run/dpdk/spdk_pid480025 00:08:44.651 Removing: /var/run/dpdk/spdk_pid480556 00:08:44.651 Removing: /var/run/dpdk/spdk_pid480896 00:08:44.651 Removing: /var/run/dpdk/spdk_pid481393 00:08:44.651 Removing: /var/run/dpdk/spdk_pid481930 00:08:44.651 Removing: /var/run/dpdk/spdk_pid482227 00:08:44.651 Removing: /var/run/dpdk/spdk_pid482834 00:08:44.651 Removing: /var/run/dpdk/spdk_pid483383 00:08:44.651 Removing: /var/run/dpdk/spdk_pid483863 00:08:44.651 Removing: /var/run/dpdk/spdk_pid484226 00:08:44.651 Removing: /var/run/dpdk/spdk_pid484759 00:08:44.651 Removing: /var/run/dpdk/spdk_pid485303 00:08:44.651 Removing: /var/run/dpdk/spdk_pid485849 00:08:44.651 Clean 00:08:44.651 killing process with pid 401656 00:08:48.844 killing process with pid 401653 00:08:48.844 killing process with pid 401655 00:08:48.844 killing process with pid 401654 00:08:48.844 23:55:37 -- common/autotest_common.sh@1436 -- # return 0 00:08:48.844 23:55:37 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup 00:08:48.844 23:55:37 -- common/autotest_common.sh@718 -- # xtrace_disable 00:08:48.844 23:55:37 -- common/autotest_common.sh@10 -- # set +x 00:08:48.844 23:55:37 -- spdk/autotest.sh@389 -- # timing_exit autotest 00:08:48.844 23:55:37 -- common/autotest_common.sh@718 -- # xtrace_disable 00:08:48.844 23:55:37 -- common/autotest_common.sh@10 -- # set +x 00:08:48.844 23:55:37 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:48.844 23:55:37 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:08:48.844 23:55:37 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:08:48.844 23:55:37 -- spdk/autotest.sh@394 -- # hash lcov 00:08:48.844 23:55:37 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:08:48.844 23:55:37 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:48.844 23:55:37 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:08:48.844 23:55:37 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:48.844 23:55:37 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:48.844 23:55:37 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:48.844 23:55:37 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:48.844 23:55:37 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:48.844 23:55:37 -- paths/export.sh@5 -- $ export PATH 00:08:48.844 23:55:37 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:48.844 23:55:37 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:08:48.844 23:55:37 -- common/autobuild_common.sh@435 -- $ date +%s 00:08:48.844 23:55:37 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1714082137.XXXXXX 00:08:48.844 23:55:37 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1714082137.WYqgks 00:08:48.844 23:55:37 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:08:48.844 23:55:37 -- common/autobuild_common.sh@441 -- $ '[' -n v23.11 ']' 00:08:48.844 23:55:37 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:48.844 23:55:37 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:08:48.844 23:55:37 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:08:48.844 23:55:37 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:08:48.844 23:55:37 -- common/autobuild_common.sh@451 -- $ get_config_params 00:08:48.844 23:55:37 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:08:48.844 23:55:37 -- common/autotest_common.sh@10 -- $ set +x 00:08:48.844 23:55:37 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user' 00:08:48.844 23:55:37 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:08:48.844 23:55:37 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:48.844 23:55:37 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:08:48.844 23:55:37 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:08:48.844 23:55:37 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:08:48.844 23:55:37 -- spdk/autopackage.sh@19 -- $ timing_finish 00:08:48.844 23:55:37 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:08:48.844 23:55:37 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:08:48.844 23:55:37 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:48.844 23:55:37 -- spdk/autopackage.sh@20 -- $ exit 0 00:08:48.844 + [[ -n 345614 ]] 00:08:48.844 + sudo kill 345614 00:08:48.854 [Pipeline] } 00:08:48.873 [Pipeline] // stage 00:08:48.878 [Pipeline] } 00:08:48.894 [Pipeline] // timeout 00:08:48.900 [Pipeline] } 00:08:48.914 [Pipeline] // catchError 00:08:48.920 [Pipeline] } 00:08:48.936 [Pipeline] // wrap 00:08:48.941 [Pipeline] } 00:08:48.956 [Pipeline] // catchError 00:08:48.964 [Pipeline] stage 00:08:48.967 [Pipeline] { (Epilogue) 00:08:48.982 [Pipeline] catchError 00:08:48.984 [Pipeline] { 00:08:48.998 [Pipeline] echo 00:08:49.000 Cleanup processes 00:08:49.006 [Pipeline] sh 00:08:49.296 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:49.296 494661 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:49.315 [Pipeline] sh 00:08:49.592 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:49.592 ++ grep -v 'sudo pgrep' 00:08:49.592 ++ awk '{print $1}' 00:08:49.592 + sudo kill -9 00:08:49.592 + true 00:08:49.601 [Pipeline] sh 00:08:49.880 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:08:49.880 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:08:49.880 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:08:51.267 [Pipeline] sh 00:08:51.552 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:08:51.552 Artifacts sizes are good 00:08:51.565 [Pipeline] archiveArtifacts 00:08:51.572 Archiving artifacts 00:08:51.628 [Pipeline] sh 00:08:51.912 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest 00:08:51.926 [Pipeline] cleanWs 00:08:51.935 [WS-CLEANUP] Deleting project workspace... 00:08:51.935 [WS-CLEANUP] Deferred wipeout is used... 00:08:51.941 [WS-CLEANUP] done 00:08:51.943 [Pipeline] } 00:08:51.962 [Pipeline] // catchError 00:08:51.974 [Pipeline] sh 00:08:52.350 + logger -p user.info -t JENKINS-CI 00:08:52.385 [Pipeline] } 00:08:52.398 [Pipeline] // stage 00:08:52.403 [Pipeline] } 00:08:52.417 [Pipeline] // node 00:08:52.423 [Pipeline] End of Pipeline 00:08:52.460 Finished: SUCCESS